
On Truth and Lying in the Extra German Sense

Illustration by Homestead

Rebecca Schuman | Longreads | June 2019 | 15 minutes (3,962 words)

Would you like to know if you’ve gained weight? If you’re annoying, or too talkative, or not as smart as you think? If you’re doing something, literally anything, the wrong way? Just ask a German and they will tell you immediately. Germans do not do this to hurt your feelings. There isn’t even a single long word in German for “hurt feelings,” they just translate the English directly (verletzte Gefühle), and everyone knows that direct translation from the English is how Germans demonstrate their disdain. There is, however, a common and beloved expression for an individual who makes a big show of having hurt feelings, and that is beleidigte Leberwurst, or a perennially “insulted liver sausage,” because hurt fee-fees are for weak non-German babies.

After all, Germans are just being direct: unmittelbar, or literally translated, “unmediated.” Their assertions are simply unverblümt, or “not putting a flower on it.” They’re not mean, they’re freimütig, or “free-hearted.” They’re just being forthright: offen, “open,” which is a good thing, ja? Germans couldn’t even begin to imagine why being brutally honest would hurt someone in the first place! If the truth hurts you, isn’t that more your fault than the truth’s?


Remembering Dr. John

Ronald C. Modra/Sports Imagery/Getty Images

The first Dr. John died in August 1885. He was known by many names, as New Orleans chronicler Lafcadio Hearn noted in his obituary.

“Jean Montanet, or Jean La Ficelle, or Jean Latanié, or Jean Racine, or Jean Grisgris, or Jean Macaque, or Jean Bayou, or ‘Voudoo John,’ or ‘Bayou John,’ or ‘Doctor John’ might well have been termed ‘The Last of the Voudoos,’” Hearn wrote for Harper’s Weekly that November, “not that the strange association with which he was affiliated has ceased to exist with his death, but that he was the last really important figure of a long line of wizards or witches whose African titles were recognized, and who exercised an influence over the colored population.”

The second Dr. John just died on June 6, 2019. In a way he, too, was a wizard — at least in the sense that anything done wonderfully well cannot be told from magic. This latter Dr. John was also associated with New Orleans and exercised his own influence as a singer, songwriter, and musician.

Born as Malcolm John Rebennack Jr., Dr. John was part of the third wave of influence — first jazz, then rock, and then funk — to emerge from the Crescent City, a place more responsible for American popular music than any other. His career took off while he was in exile, trying to preserve the music he grew up with. It ended with the world acknowledging his efforts to broaden our vocabulary, musically and otherwise.

“I been in the right trip,” he once sang — a line written for him by Bob Dylan, “but I must have used the wrong car.”

Born on November 20, 1941, “Mac” Rebennack grew up attending gigs and recording sessions with his music aficionado father, who turned him on to New Orleans jazz greats King Oliver and Louis Armstrong.

“Well, my father’s records were what they called ‘race records,’ which was blues, rhythm and blues, traditional jazz, and gospel,” Rebennack told Smithsonian Magazine in 2009. “He owned a record shop and had a large black clientele. They would come by and play a record to decide if they liked it. I got the idea as a little kid that I wanted to be a piano player, because I remember hearing [boogie-woogie pianist] Pete Johnson. I thought why not just be Pete Johnson?”

Fats Domino’s guitarists taught the young Rebennack some stuff. Meeting the great New Orleans pianist Professor Longhair inspired him to become a professional musician. Rebennack was present when Little Richard cut “Tutti Frutti” at Cosimo Matassa’s J&M Music Shop and Studio on North Rampart Street. By the early 1960s, he was playing professionally, doing session work for such local luminaries as Art Neville and Allen Toussaint. Ace Records made him an A&R man at the age of 16.

By this time, Rebennack was also hooked on heroin and subsequently busted for possession. After his release from prison in 1965, he returned to a different world. It was already more difficult to play in mixed groups. “When the civil rights movement heated up, it became more dangerous to travel as part of these package shows,” he remembered. “Before then, we used to travel all over the South with no problem — me, Earl King, Guitar Slim, Chuck Berry, people like that — but then suddenly, we started getting hassled.”

Moreover, New Orleans was trying to clean up its seedy image, and many of its music venues, according to Rebennack, were “buckets of blood joints. It was not a wholesome atmosphere where you could bring your family along. There were gang fights. The security and the police would fire guns into the crowd. … Later [New Orleans District Attorney] Jim Garrison padlocked and shut down the whole music scene.” It was time to go.

Rebennack moved to Los Angeles, where he was soon playing sessions with Aretha Franklin, Bob Dylan, and Frank Zappa. “They recruited about half of New Orleans one time to go out and do The Sonny and Cher Show,” remembered Rebennack’s friend Coco Robicheaux. “They were all out there doin’ that, and Sonny was always after [Rebennack], ‘Man, I got a state-of-the-art studio, it’s there for you any time you want it. Y’all just lay around here, why don’tcha go do somethin’?”

Rebennack had an idea about a character someone could play, based on Jean Montanet. But he didn’t want to be Dr. John. He wanted his singer friend Ronnie Barron to do it. “I was never fond of front men,” Rebennack told the Smithsonian. “I didn’t want to be one.”

Barron was the reason Rebennack switched from guitar to piano. Years before, at a gig in Jacksonville, Florida, Barron was being pistol-whipped. “Ronnie was just a kid and his mother had told me, ‘You better look out for my son,’” Rebennack remembered. “Oh god, that was all I was thinking about. I tried to stop the guy, I had my hand over the barrel and he shot.”

“It just went right through my finger,” Rebennack said. “And my finger was hanging by a piece of skin. … They put it back on in the hospital and they sewed it back on very poorly and it never did work right.” When asked how he was able to play piano with a crooked finger, Rebennack quipped, “I try to avoid that finger when I play the piano.”

Barron was also responsible for creating a stage persona early on that inspired Rebennack.

“I met Mac Rebennack when I was 15,” Barron once said.

I’d been aware of him since I was 12, and he had a good working band that played on the west side where I lived, in Algiers. New Orleans was a real fly-by-night town, where there was a big tourist crowd and people wanted to drink. They didn’t care about the music that much, just wanted to be entertained. So I created my “Reverend Ether” character, almost by accident. I made up this mythology about the voodoo and the gumbo. I’d shake the tambourine and say, “I’m gonna drop the truth on you!” I made up all this shit. This was before I worked with Mac, when I was working in a club on Bourbon Street. He’d come in and kind of watch what I was doing. … Mac realized the value in it, and after he hired me he wanted me to be the original Dr. John, because I already had a handle on the thing.

When Barron was hired by Sonny and Cher and moved west, he gave the Reverend Ether character to Rebennack.

Back in Los Angeles, Barron wasn’t interested in adopting Rebennack’s Dr. John persona. “Ronnie was like this good-lookin’ guy, liked to wear suits, he didn’t want to be no swamp thing,” Robicheaux said. “So they talked Mac into doin’ it. ‘You be Dr. John.’ And everybody loved it.”

Rebennack’s conga player told him, “Look, if Bob Dylan and Sonny and Cher can do it, you can do it.” And so Dr. John was returned to earth and put on a mission.

“I did my first record,” Rebennack said, “to keep New Orleans gris-gris alive.”

The first Dr. John was also a gris-gris man. According to Lafcadio Hearn, Jean Montanet claimed to be a prince’s son from Senegal, of the free-born Bambara tribe. As a youth, he was kidnapped by Spanish slavers. After regaining his freedom, he traveled the world as a ship’s cook, finally settling in New Orleans. He became wealthy through fortune-telling and the folk magic practices that we now know as rootwork and hoodoo.

“By-and-by his reputation became so great that he was able to demand and obtain immense fees,” Hearn wrote. “People of both races and both sexes thronged to see him — many coming even from far-away creole towns in the parishes, and well-dressed women, closely veiled, often knocked at his door.” Before long, Montanet was worth $50,000 — enormous wealth for the mid-19th century.

The gris-gris originated in West Africa, and Montanet brought the practice with him. It takes the form of a fetish, carried by the user, for protection or benefit. A gris-gris is often composed of an uneven number of bones, colored objects and stones, graveyard soil, salt, and other exotic ingredients such as bird nests. Gris-gris culture was already a part of Louisiana voodoo, brought to the state by enslaved West Africans, where it syncretized with elements of Catholicism. Hearn, a white man, described Montanet’s religion as “primitive in the extreme.”

If during his years of servitude in a Catholic colony he had imbibed some notions of Romish Christianity, it is certain at least that the Christian ideas were always subordinated to the African — just as the image of the Virgin Mary was used by him merely as an auxiliary fetich in his witchcraft, and was considered as possessing much less power than the “elephant’s toof.” He was in many respects a humbug; but he may have sincerely believed in the efficacy of certain superstitious rites of his own.

Rebennack had his own “notions of Romish Christianity”: He attended New Orleans’s Jesuit High School until kicked out for his musical preoccupations. Other forces connected him to Jean Montanet. “There was a guy the name of Dr. John, a hoodoo guy in New Orleans,” Rebennack once said. “He was competition to Marie Laveau. He was like her opposite. I actually got a clipping from the Times-Picayune newspaper about how my great-great-great-grandpa Wayne was busted with this guy for running a voodoo operation in a whorehouse in 1860. I decided I would produce the record with this as a concept.”

That record was 1968’s atmospheric, ominous, and thoroughly funky Gris-Gris. “One thing I always did was believe,” Rebennack told Mojo magazine. “I used to play for gigs for the Gris-Gris church. I dug the music, and that’s what I was trying to capture.”

“They call me Dr. John, known as the Night Tripper,” he sings on “Gris-Gris Gumbo Ya Ya,” in a raspy voice predictive of Tom Waits. (Rebennack once told a New Orleans paper, “I’m tripping through the shortcuts of existment to feel it and that’s good.”)

Got my satchel of gris gris in my hand

Day trippin’ up, back down the bayou

I’m the last of the best

They call me the gris gris man

“I always thought [voodoo] was a beautiful part of New Orleans culture,” Rebennack once said. “It’s such a blend of stuff; African, Choctaw, Christianity, Spanish.” He told the Smithsonian that he’d approached “some of the reverend mothers” and asked if he could perform the sacred songs. “But I couldn’t do them because it was not for a ceremony,” he said. “So I wrote something similar. One we used went ‘corn boule killy caw caw, walk on gilded splinters.’ It actually translates to ‘cornbread, coffee, and molasses’ in old Creole dialect.”

“It’s supposed to be ‘splendors’ but I turned it into ‘splinters,’” Rebennack remembered. “I just thought splinters sounded better and I always pictured splinters when I sung it.”

Coco Robicheaux had a more complex take. “Dr. John, he was very much interested in metaphysics. We had this little place on St. Philip Street. In voodoo they call the gilded splinters the points of a planet. Mystically they appear like little gilded splinters, like little gold, like fire that holds still. They’re different strengths at different times. I guess it ties in with astrology, and influence the energy. That’s what that’s about.”

Gris-Gris didn’t do that well commercially. “What is this record you gave me?” asked Rebennack’s label boss. “Why didn’t you give me a record that we could sell?” Still, the new Dr. John created a cult following by doubling down on the hoodoo visuals. He would appear onstage in a puff of smoke, decked in feathers (or merely body paint), robes, and headdresses. For a while, one of his opening acts was someone named Prince Kiyama, who would bite the heads off live chickens and drink the blood. Sometimes his backup dancers were nude.

It should go without saying that the new Dr. John’s act had as much to do with voodoo as David Seville’s 1958 hit “Witch Doctor” did with African shamanism, which is to say, not at all. When questioned about his Dr. John stage show later in life, Rebennack insisted that “it was very authentic,” and compared the abandonment of his dancers to “things that might happen in voodoo, where they’re taken by a spirit.” It seems more likely that the act was designed to appeal to his young, libertine audience than to offer an avenue into a different, complex belief system. At any rate, Rebennack retired all that by 1976, when he appeared at The Band’s farewell concert (later immortalized in Martin Scorsese’s documentary The Last Waltz) to sing the charming, if not entirely wholesome, “Such a Night.”

America has always had two prominent cultures: the colonial and the communal. The colonial culture mimics or appropriates the voice of the underclass, manifesting itself in minstrelsy and coon songs, and even affecting civil rights–era folk music.

The communal strain of American cultural expression has been just as strong, but more fruitful. Think of Congo Square, the place in New Orleans where the first Dr. John and Marie Laveau plied their trades. It was here that slaves were allowed to “gather, roughly by tribe, to play music, sing, and dance” in the 18th and 19th centuries. These rhythms, when combined with blues and European modalities and military marching band instruments, became jazz. Nothing like that had existed before. In the same sense, it’s how Louisiana voodoo was created out of a gumbo of multicultural spiritual and religious expressions to become something unique. Through the centuries, we have all gathered roughly by tribe. Sometimes it’s produced magic.

Mac “Dr. John” Rebennack embodied both of these cultures. His hoodoo schtick had a little of the “bone through your nose” stereotypes typified by artists like Screamin’ Jay Hawkins; it didn’t contribute much to cultural understanding beyond a new vocabulary of exotic words and phrases, which he had appropriated largely for effect.

But Rebennack was a musician — and more than that, a New Orleanian — through and through. He learned from black and white people, was shocked when a New Orleans auditorium wouldn’t let his white band back Bo Diddley, and dedicated himself to preserving that rolling, loose-limbed music he believed was dying. Later on, he often recorded with the Meters, the one band that epitomized New Orleans funk. Rebennack also revered his musical ancestors, recording tributes to Professor Longhair, Duke Ellington, and Louis Armstrong, New Orleans’s great ambassador of jazz. “I’m trying to give props to Pops,” Rebennack once said about his Armstrong dedication. “I think we’re all supposed to give props to our elders.”  

***

Tom Maxwell is a writer and musician. He likes how one informs the other.

Editor: Aaron Gilbreath; Fact-checker: Jason Stavers

True Roots

Daniel Berehulak/Getty Images

Ronnie Citron-Fink | True Roots | Island Press | June 2019 | 34 minutes (5,655 words)

 

How’d you do it? Are you doing that on purpose? Are you okay? Ever since I stopped coloring my silver hair, I’ve gotten a lot of questions. One of the most common during my hair transition was Why are you letting it go gray? While my roots didn’t ask permission before they stopped growing in dark brown, it was a complex mix of fear and determination that rearranged my beauty priorities. The question of why — why, after twenty-five years of using chemical dyes, I gave them up — is something I’ve thought about a lot.

My world began to shift four years ago. I was sitting in a meeting about toxics reform in Washington, DC, when an environmental scientist began to describe the buildup of chemicals in our bodies. As she rattled off a list of ingredients in personal care products — toluene, benzophenone, stearates, triclosan — my scalp started to tingle. “We’re just beginning to understand how these chemicals compromise long-term health,” she concluded.


The Artificial Intelligence of the Public Intellectual

morkeman / Getty

Soraya Roberts | Longreads | May 2019 | 8 minutes (2,228 words)

“Well, that’s a really important thing to investigate.” While Naomi Wolf’s intellectual side failed her last week, her public side did not. That first line was her measured response when a BBC interviewer pointed out — on live radio — that cursory research had disproven a major thesis in her new book, Outrages: Sex, Censorship, and the Criminalization of Love (she misinterpreted a Victorian legal term, “death recorded,” to mean execution — the term actually meant the person was pardoned). Hearing this go down, journalists like me theorized how we would react in similar circumstances (defenestration) and decried the lack of fact-checkers in publishing (fact: Authors often have to pay for their own). The mistake did, however, ironically, offer one corrective: It turned Wolf from cerebral superhero into mere mortal. No longer was she an otherworldly intellect who could suddenly complete her Ph.D. — abandoned at Oxford when she was a Rhodes Scholar in the mid-’80s, Outrages is a reworking of her second, successful, attempt — while juggling columns for outlets like The Guardian, a speaking circuit, an institute for ethical leadership, and her own site, DailyClout, not to mention a new marriage. Something had to give, and it was the Victorians.

Once, the public intellectual had the deserved reputation of a scholarly individual who steered the public discourse: I always think of Oscar Wilde, the perfect dinner wit who could riff on any subject on command and always had the presence of mind to come up with an immortal line like, “One can survive everything nowadays except death.” The public intellectual now has no time for dinner. Wolf, for instance, parlayed the success of her 1991 book The Beauty Myth into an intellectual career that has spanned three decades, multiple books, and a couple of political advisory jobs, in which time her supposed expertise has spread far beyond third-wave feminism. She has become a symbol of intellectual rigor that spans everything from vaginas to dictatorships — a sort of lifestyle brand for the brain. Other thought leaders like her include Jordan Peterson, Fareed Zakaria, and Jill Abramson. Their minds have hijacked the public trust, each one acting as the pinnacle of intellect, an individual example of brilliance to cut through all the dullness, before sacrificing the very rigor that put them there in order to maintain the illusion floated by the media, by them, even by us. The public intellectual once meant public action, a voice from the outside shifting the inside, but then it became personal, populated by self-serving insiders. The public intellectual thus became an extension — rather than an indictment — of the American Dream, the idea that one person, on their own, can achieve anything, including being the smartest person in the room as well as the richest.

* * *

I accuse the Age of Enlightenment of being indirectly responsible for 12 Rules for Life. The increasingly literate population of the 18th century was primed to live up to the era’s ultimate aspiration: an increasingly informed public. This was a time of debates, public lectures, and publications and fame for the academics behind them. Ralph Waldo Emerson, for one. In his celebrated “The American Scholar” speech from 1837, Emerson provided a framework for an American cultural identity — distinct from Europe’s — which was composed of a multifaceted intellect (the One Man theory). “The scholar is that man who must take up into himself all the ability of the time, all the contributions of the past, all the hopes of the future,” he said. “In yourself slumbers the whole of Reason; it is for you to know all, it is for you to dare all.” While Emerson argued that the intellectual was bound to action, the “public intellectual” really arrived at the end of the 19th century, when French novelist Émile Zola publicly accused the French military of antisemitism over the Dreyfus Affair in an open letter published in L’Aurore newspaper in 1898. With “J’Accuse…!,” the social commentary Zola spread through his naturalist novels was transformed into a direct appeal to the public: Observational wisdom became intellectual action. “I have but one passion: to enlighten those who have been kept in the dark, in the name of humanity which has suffered so much and is entitled to happiness,” he wrote. “My fiery protest is simply the cry of my very soul.”

The public intellectual thenceforth became the individual who used scholarship for social justice. But only briefly. After the Second World War, universities opened up to serve those who had served America, which led to a boost in educated citizens and a captive audience for philosophers and other scholars. By the end of the ’60s, television commanded our attention further with learned debates on The Dick Cavett Show — where autodidact James Baldwin famously dressed down Yale philosopher Paul Weiss — and Firing Line with William F. Buckley Jr. (also famously destroyed by Baldwin), which would go on to host academics like Camille Paglia in the ’90s. But Culture Trip editor Michael Barron dates the “splintering of televised American intellectualism” to a 1968 debate between Gore Vidal — “I want to make 200 million people change their minds,” the “writer-hero” once said — and Buckley, which devolved into playground insults. A decade later, the public intellectual reached its celebrity peak, with Susan Sontag introducing the branded brain in People magazine (“I’m a book junkie. … I buy special editions like other women shop for designer originals at Saks.”).

As television lost patience with Vidal’s verbose bravado, he was replaced with more telegenic — angrier, stupider, more right-wing — white men like Bill O’Reilly, who did not clarify nuance but blustered over the issues of the day; the public intellectual was now all public, no intellect. Which is to say, the celebrity pushed out the scholar, but it was on its way out anyway. By the ’80s, the communal philosophical and political conversations of the post-war era slunk back to the confines of academia, which became increasingly professionalized, specialized, and insular, producing experts with less general and public-facing knowledge. “Anyone who engages in public debate as a scholar is at risk of being labelled not a serious scholar, someone who is diverting their attention and resources away from research and publicly seeking personal aggrandizement,” one professor told University Affairs in 2014. “It discourages people from participating at a time when public issues are more complicated and ethically fraught, more requiring of diverse voices than ever before.” Diversity rarely got past the ivy, with the towering brilliance of trespassers like Baldwin and Zora Neale Hurston, among other marginalized writers, limited by their circumstances. “The white audience does not seek out black public intellectuals to challenge their worldview,” wrote Mychal Denzel Smith in Harper’s last year, “instead they are meant to serve as tour guides through a foreign experience that the white audience wishes to keep at a comfortable distance.”

Speaking of white audiences … here’s where I mention the intellectual dark web even though I would rather not. It’s the place — online, outside the academy, in pseudo-intellectual “free thought” mag Quillette — where reactionary “intellectuals” flash their advanced degrees while claiming their views are too edgy for the schools that graduated them. These are your Petersons, your Sam Harrises, your Ben Shapiros, the white (non)thinkers, usually men, tied in some vague way to academia, which they use to validate their anti-intellectualism while passing their feelings off as philosophy and, worse, as (mis)guides for the misguided. Last month, a hyped debate between psychology professor Peterson and philosopher Slavoj Žižek had the former spending his opening remarks stumbling around Marxism, having only just read The Communist Manifesto for the first time since high school. As Andray Domise wrote in Maclean’s, “The good professor hadn’t done his homework.” But neither have his fans.

But it’s not just the conservative public intellectuals who are slacking off. Earlier this year, Jill Abramson, the former executive editor of The New York Times, published Merchants of Truth: The Business of News and the Fight for Facts. She was the foremost mind on journalism in the Trump era for roughly two seconds before being accused of plagiarizing parts of her book. Her response revealed that the authorship wasn’t exactly hers alone, a fact she disclosed only in order to shift blame for her mistakes onto others. “I did have fact-checking, I did have assistants in research, and in some cases, the drafting of parts of the book,” she told NPR. “I certainly did spend money. But maybe it wasn’t enough.” Abramson’s explanation implied a tradition in which, if you are smart enough to be rich enough, you can pay to uphold your intellectual reputation, no matter how artificial it may be.

That certainly wasn’t the first time a public intellectual overrepresented their abilities. CNN host Fareed Zakaria, a specialist in foreign policy with a Ph.D. from Harvard — a marker of intelligence that can almost stand in for actual acumen these days — has been accused multiple times of plagiarism, despite “stripping down” his extensive workload (books, speeches, columns, tweets). Yet he continues to host his own show and to write a column for The Washington Post in the midst of a growing number of unemployed journalists and dwindling number of outlets. Which is part of the problem. “What happens in the media is the cult of personality,” said Charles R. Eisendrath, director of the Livingston Awards and Knight-Wallace Fellowship, in the Times. “As long as it’s cheaper to brand individual personalities than to build staff and bolster their brand, they will do it.” Which is why Wolf, and even Abramson, are unlikely to be gone for good.

To be honest, we want them around. Media output hasn’t contracted along with the industry, so it’s easier to follow an individual than a sprawling media site, just like it’s easier to consult a YouTube beauty influencer than it is to browse an entire Sephora. With public intellectuals concealing the amount of work required of them, the pressure to live up to the myth we are all helping to maintain only increases, since the rest of us have given up on trying to keep pace with these superstars. They think better than we ever could, so why should we bother? Except that, like the human beings they are, they’re cutting corners and making errors and no longer have room to think the way they did when they first got noticed. It takes significant strength of character in this economy of nonstop (and precarious) work to bow out, but Ta-Nehisi Coates did when he stepped down last year from his columnist gig at The Atlantic, where he had worked long before he started writing books and comics. “I became the public face of the magazine in many ways and I don’t really want to be that,” he told The Washington Post. “I want to be a writer. I’m not a symbol of what The Atlantic wants to do or whatever.”

* * *

Of course a public intellectual saw this coming. In a 1968 discussion between Norman Mailer and Marshall McLuhan on identity in the technology age (which explains the rise in STEM-based public intellectuals), the latter said, “When you give people too much information, they resort to pattern recognition.” The individuals who have since become symbols of thought — from the right (Christina Hoff Sommers) to the left (Roxane Gay) — are overrepresented in the media, contravening the original definition of their role as outsiders who spur public action against the insiders. In a capitalist system that promotes branded individualism at the expense of collective action, the public intellectual becomes a myth of impossible aspiration that not even it can live up to, which is the point — to keep selling a dream that is easier to buy than to engage in reality. But an increasingly intelligent public is gaining ground.

The “Public Intellectual” entry in Urban Dictionary defines it as, “A professor who spends too much time on Twitter,” citing Peterson as an example. Ha? The entry is by OrinKerr, who may or may not be (I am leaning toward the former) a legal scholar who writes for the conservative Volokh Conspiracy blog. His bad joke is facetious, but not entirely inaccurate — there’s a shift afoot, from the traditional individual public intellectual toward a collective model. That includes online activists and writers like Mikki Kendall, who regularly leads discussions about feminism and race on Twitter; Bill McKibben, who cofounded 350.org, an online community of climate change activists; and YouTubers like Natalie Wynn, whose ContraPoints video essays respond to real questions from alt-right men. In both models, complex thought does not reside solely with the individual, but engages the community. This is a reversion to one of the early definitions of public intellectualism by philosopher Antonio Gramsci. “The traditional and vulgarized type of the intellectual is given by the man of letters, the philosopher, the artist,” he wrote in his Prison Notebooks — first published in 1971. “The mode of being of the new intellectual can no longer consist in eloquence, which is an exterior and momentary mover of feelings and passions, but in active participation in practical life, as constructor, organizer, ‘permanent persuader’ and not just a simple orator.” It doesn’t matter if you’re the smartest person in the room, as long as you can make it move.

* * *

Soraya Roberts is a culture columnist at Longreads.

There Is No Other Way To Say This

Tony Comiti / Getty, Illustration by Homestead

Melissa Batchelor Warnke | Longreads | May 2019 | 14 minutes (3,668 words)

 

“What you have heard is true. I was in his house.” So begins one of the most famous poems of the late twentieth century, Carolyn Forché’s “The Colonel,” which was part of an early body of work that seemed to contemporary admirers as if it had “reinvent[ed] the political lyric at a moment of profound depoliticization.” The poem describes a meeting Forché had with a Salvadoran military leader in his home in 1978, a year before the coup that sparked that country’s extraordinarily brutal civil war, which lasted for more than twelve years. The poem’s power lies in the quick juxtaposition of quotidian details — the colonel’s daughter filing her nails, a cop show playing on TV, mangoes being served — with his sudden sadistic flourish:

… The colonel returned with a sack used to bring groceries home. He spilled many human ears on the table. They were like dried peach halves. There is no other way to say this. He took one of them in his hands, shook it in our faces, dropped it into a water glass. It came alive there. I am tired of fooling around he said. As for the rights of anyone, tell your people they can go fuck themselves. He swept the ears to the floor with his arm and held the last of his wine in the air. …

“Something for your poetry, no?” the colonel says next. The implication is clear: the young human rights advocate’s writing is pointless; the colonel’s position will forever afford him impunity. Read more…

Technology Is as Biased as Its Makers

"Patty Ramge appears dejected as she looks at her Ford Pinto." Bettmann / Getty

Lizzie O’Shea | an excerpt adapted from Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Teach Us about Digital Technology | Verso | May 2019 | 30 minutes (8,211 words)

In the late spring of 1972, Lily Gray was driving her new Ford Pinto on a freeway in Los Angeles, and her thirteen-year-old neighbor, Richard Grimshaw, was in the passenger seat. The car stalled and was struck from behind at around 30 mph. The Pinto burst into flames, killing Gray and seriously injuring Grimshaw. He suffered permanent and disfiguring burns to his face and body, lost several fingers and required multiple surgeries.

Six years later, in Indiana, three teenaged girls died in a Ford Pinto that had been rammed from behind by a van. The body of the car reportedly collapsed “like an accordion,” trapping them inside. The fuel tank ruptured and ignited into a fireball.

Both incidents were the subject of legal proceedings, which now bookend the history of one of the greatest scandals in American consumer history. The claim, made in these cases and most famously in an exposé in Mother Jones by Mark Dowie in 1977, was that Ford had shown a callous disregard for the lives of its customers. The weakness in the design of the Pinto — which made it susceptible to fuel leaks and hence fires — was known to the company. So too were the potential solutions to the problem. These included a number of possible design alterations, one of which was the insertion of a plastic buffer between the bumper and the fuel tank that would have cost around a dollar. For a variety of reasons, related to costs and the absence of rigorous safety regulations, Ford mass-produced the Pinto without the buffer.

Most galling, Dowie documented through internal memos how at one point the company prepared a cost-benefit analysis of the design process. Burn injuries and burn deaths were assigned a price ($67,000 and $200,000 respectively), and these prices were measured against the costs of implementing various options that could have improved the safety of the Pinto. It turned out to be a monumental miscalculation, but, that aside, the morality of this approach was what captured the public’s attention. “Ford knows the Pinto is a firetrap,” Dowie wrote, “yet it has paid out millions to settle damage suits out of court, and it is prepared to spend millions more lobbying against safety standards.” Read more…

Glass, Pie, Candle, Gun

Henry Griffin / AP

Sean Howe | Longreads | May 2019 |  15 minutes (3,853 words)

In November 2018, after the Secret Service seized the security credentials of CNN reporter Jim Acosta, the White House Press Secretary stated the reason for the revocation was that the administration would “never tolerate a reporter placing his hands on a young woman just trying to do her job as a White House intern.” Within hours, attorney Ted Boutrous responded on Twitter:
Read more…

Mothers are the Backbone of the Revolution

Lizeth Dávila, 39, holds a photo of her murdered son Álvaro, 15, in her hands. All photos by Jacky Muniello.

Alice Driver | Longreads | May 2019 | 7 minutes (1,957 words)

She will tell the story of her child’s murder as many times as needed. She will tell it until her voice breaks, until her eyes no longer fill with tears, until her demands for justice are met. She could be the mother of Michael Brown in Ferguson, Missouri, or Alyssa Alhadeff in Parkland, Florida, or Álvaro Manuel Conrado Dávila in Managua, Nicaragua. The history of mothers as activists in the Americas is firmly rooted in the Mothers of the Plaza de Mayo in Argentina, a group of hundreds of mothers who marched weekly in front of the presidential palace in Buenos Aires to protest the murder and disappearance of their children under the military dictatorship that ruled the country from 1976 to 1983. These mothers, bound together by the private pain of a child’s murder or disappearance, turn their anguish and rage outward into public movements to demand justice, often at great risk to themselves.

Read more…

Mothering on the Borders

Illustration by Ellice Weaver

Yifat Susskind | Longreads | April 2019 | 17 minutes (4,193 words)

 
When my sons were younger, I remember explaining to them the difference between real and imaginary. Their dreams and nightmares weren’t real; you couldn’t see or touch them. The stories in their books weren’t real; I soothed their worries about monsters coming to life by assuring my boys it was all just imaginary.

Those conversations have surfaced in my mind as I’ve been thinking about borders: these made-up lines etched across the Earth by the powerful to hold their power in place — lines that are imaginary at first and then all too real.

Just look to the killing field that Israel has sown around Gaza, imprisoning people on a spit of land so ruined that it will soon be uninhabitable. It has been over a year since people there rose up to stage ongoing protests against the occupation that has ruined lives and destroyed communities.

There’s also the US-Mexico border in Arizona, cutting across the land of the Indigenous Tohono O’odham People, now thick with the apparatus of state violence: cameras, fences, drones, guns, jails. Or the line that was drawn to divide Korea, now the world’s most militarized border, stuck with the Orwellian designation DMZ, for “demilitarized zone.”

As the director of MADRE, an international women’s rights organization, I’ve spent time recently at each of these borders, with feminist peace activists and Indigenous women leaders. In each place, I listened as women described what it’s like to be trapped by borders, as mothers told of their responsibility for the survival and peace of mind of their children in these zones of hostility and violence, loss and separation.

To see the world through the eyes of those who are responsible for its most vulnerable people: that’s what it means to work from the perspective of mothers. When we do this, we understand anew the issues that drive migration and border brutality — and the solutions needed to address them.

***

Read more…

The Man Who’s Going to Save Your Neighborhood Grocery Store

Illustration by Vinnie Neuberg

Joe Fassler | The Counter & Longreads | April 2019 | 8,802 words (33 minutes)

This story is published in partnership with The Counter, with reporting supported by the 11th Hour Food and Farming Fellowship at the University of California, Berkeley.  


In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.

“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”

Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.

He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second new grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook — “buy one, get one” sales, coupons in the weekly circular — is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.

Niemann hired Kelley in the context of this imminent doom. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who’s worked for decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime: carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.

Kevin Kelley, illustration by Vinnie Neuberg

Rich Niemann, illustration by Vinnie Neuberg

The store that resulted is called Harvest Market, which opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls; 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.

Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined: a place where farmers and their urban customers can meet, a crucial link between the city and the country.

But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.

* * *

Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: He helps grocery stores develop frameworks for the unexpected, too.

You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.

A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.

“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table; we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to forget that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”

Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.

“I make my living convincing male skeptics of the power of emotions,” he says.

Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.

You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.

Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.

I make my living convincing male skeptics of the power of emotions.

In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”

It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.

The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while dollar stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.

A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertsons (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).

But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.

By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.

The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart — and its e-commerce platform, Jet.com — on convenience and price.

That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as a convenient pickup depot for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $119 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive additional 10 percent discounts on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.

Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.

“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”

The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.

* * *

Nobody grows up daydreaming about designing grocery stores, including Kelley. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.

“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs—or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?

“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”

In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.

The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a Charlotte, North Carolina–based grocery chain with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”

At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.

Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.

Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.

When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.

Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”

Kelley’s team stipulated that Whole Foods could only have seven layers of information, which ranged from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” and jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way that the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)

With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.

* * *

Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.


Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s a well-known industry secret that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.

Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”

In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.

Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”

In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers waited in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?

The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.

You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods and build suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.

Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)

“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”

Since then, Kelley has continued to build his case to skeptical audiences of male executives, with mixed success. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers spent 1.3 percent more for every 1 percent increase in dwell time in 2015.

Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins and recapturing some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,’” Kelley says, “‘or you’re going to be out of business.’”

Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.

Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.

* * *

When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.

Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”

Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”

What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”

Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig — but it ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in an ad for Ram trucks in the previous year’s Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, where God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadowlark.” In other words, God needs a farmer.

Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.

Rich Niemann, the jaded supermarket elder statesman, broke down and wept.

* * *

I have never been a fan of shopping. Spending money stresses me out. I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was, a meaningful experience, or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest, the most interesting store in America.

Champaign is two hours south of O’Hare by car. Crossing Illinois’s vast landscape of unrelenting farmland, you appreciate the state’s sheer scale, how far its lower half is from Chicago. Champaign is a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.

I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.

And then we stepped inside.

Everywhere, sunlight poured in through the tall glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door came from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. He was right about the produce: it was beautiful, with more local options than I’d ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap, at two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental that it’s rarely sold outside of farmers’ markets. Early on, staff had to explain to some teenage cashiers what they were — the cashiers had never seen squash blossoms before.

I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.

Then there are the informational placards that point out suppliers and methods.

There are at least a half dozen varieties of small-batch honey; you can find pastured eggs for $3.69. The liquor section includes local selections, like whiskey distilled in DeKalb and a display with cutting boards made from local wood by Niemann Foods’ HR Manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middlemen distributors to offer reasonably priced, local goods, is a hallmark of Harvest Market.

That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they really try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing a kind of ethical opportunity to make better food available to more people.

“When,” Kelley remembers asking, “did you start to mistake opulence for success?”

In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall, a financial setback that the company never fully recovered from — and that, one could argue, ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an “all-of-the-above” approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But that strong emphasis on making local food more accessible and affordable makes it an interesting counterpart to Kelley’s former client.

The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.

“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.”

Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy, which it churns into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.

As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. What was it that made me want to encounter something more outrageous, more radical, more theatrical and bizarre? I wanted animatronic puppets. I wanted fog machines.

I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their click-and-collect pickups, their chances to test-drive Alexa by playing a song or switching on a fan. I guess, deep down, I was wondering if something this subtle really could work.

“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.

He grinned at me.

“That’s not Illinois,” he said.

We walked to Harvest’s restaurant, a 40-person seating area plus an adjacent bar with a row of stools, which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices. Through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.

For a summer weekday at noon in a grocery store, I was amazed by how many people were eating and working on laptops. One guy had his machine hooked up to a full-sized monitor he’d lugged up the stairs — he’d made a customized wooden piece that hooks into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.

We sat down, and Kettler insisted I eat the corn first — slathered with the house-made butter, while it was still hot. He reminded me that it’s grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.

The corn was good, but I wanted to ask Niemann if the grocery industry was really in such bad shape, and he told me it was. I assumed he’d want to talk about Amazon and its acquisition of Whole Foods and the way e-commerce has changed the game. He acknowledged all that, but to my surprise he said the biggest factor was something else entirely — a massive shift happening in the world of consumer packaged goods, or CPGs.

For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.

When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with Campbell’s Soup. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their websites. They can launch their own pop-ups.

It was only then that I realized how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. But it’s another thing when no one wants what you’re selling.

Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.

Grocery stores have two ironclad rules. First, that grocers set the prices, and farmers do what they can within those mandates. And second, that everyone works with distributors who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks those rules with local farmers and foodmakers. Suppliers — from the locally milled wheat to the local produce to the Kilgus Farms sweet cream that goes into the churner — truck their products right to the back. By avoiding middlemen and their surcharges, Harvest is able to pay suppliers more and charge customers less. You can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot. At the same time, suppliers get to set their own prices: Niemann’s suppliers tell him what they need to charge, and Niemann adds a standard margin and lets customers decide if they’re willing to pay. That’s a massive change, and it’s virtually unheard of in supermarkets.

If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.

* * *

In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.

Niemann tells me his company is currently planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store has been everything he’d hoped it would be by the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint about the store has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of parking lot needed for his stores — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.

“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”

They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — it’s not about getting in and out quickly without interacting with another human being. At Harvest, you stay awhile. And that’s the point.

So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out: pass the buck, and consign the fate of the world to younger people with bolder dreams.

Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once-titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More generally, the supermarket made it easier for us to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the arrays of brightly colored packages are all owned by the same handful of multinational corporations.

But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.

This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale where building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players. We’ll have companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world at the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.

If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, owned by no one dominant company and carved up into more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. These regional grocers are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.

I want to believe that, anyway. That stores — at least in theory — could be about something bigger, and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.

Which one is right?

I guess it depends on how you feel about the movies.

Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.

Still, maybe something else is possible. Maybe in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant that we feel newly brave and noble, and ready to be different, braver versions of ourselves — that we are who we really are.

* * *

Joe Fassler, The Counter’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.

Editor: Michelle Weber
Fact checker: Matt Giles
Copy editor: Jacob Z. Gross