
If Clean Food Is for Everyone, Why Are Its Gurus All Young, Pretty Women?

Our notions of health and wellness (both charged terms these days, one might add) are still stuck in a paradigm that wouldn’t be out of place in ancient Greece: what goes on inside us must somehow be visible and recognizable on our bodies’ surface. In her Guardian essay on the rise of orthorexia — the obsession with consuming pure, “perfect” foods — Bee Wilson traces the history of a recent-yet-oh-so-familiar publishing trend: using youthful, traditionally good-looking women to sell both specific products (hello, coconut-and-oat energy balls!) and an amorphous, ever-shifting “clean” lifestyle.

Every wellness guru worth her Himalayan pink salt has a story of how changing what you eat can change your life. “Food has the power to make or break you,” wrote Amelia Freer in her 2014 bestseller Eat. Nourish. Glow. (which has sold more than 200,000 copies). Freer was leading a busy life as a personal assistant to the Prince of Wales when she realised that her tummy “looked and felt as if it had a football in it” from too many snatched dinners of cheese on toast or “factory-made food”. By giving up “processed” and convenience foods (“margarine, yuck!”) along with gluten and sugar, Freer claimed to have found the secrets to “looking younger and feeling healthier”.

Perhaps the best-known diet-transformation story of all is that of Ella Mills — possessor of more than a million Instagram followers. In 2011, Mills was diagnosed with postural tachycardia syndrome, a condition characterised by dizziness and extreme fatigue. Mills began blogging about food after discovering that her symptoms radically improved when she swapped her sugar-laden diet for “plant-based, natural foods.” Mills — who used to be a model — made following a “free-from” diet seem not drab or deprived, but deeply aspirational. By the time her first book appeared in January 2015, her vast following on social media helped her to sell 32,000 copies in the first week alone.

There was something paradoxical about the way these books were marketed. What they were selling purported to be an alternative to a sordidly commercial food industry. “If it’s got a barcode or a ‘promise’, don’t buy it,” wrote Freer. Yet clean eating is itself a wildly profitable commercial enterprise, promoted using photogenic young bloggers on a multi-billion-dollar tech platform. Literary agent Zoe Ross tells me that around 2015 she began to notice that “the market was scouring Instagram for copycat acts — specifically very pretty, very young girls pushing curated food and lifestyle.”

Read the story

Why the Most Beautiful Poems Defy Understanding

At The Walrus, Matthew Zapruder examines his relationships with poetry and with his father. Though both men had a great facility with precise language, they were unable to use it to bridge the distance between them. Likening poems to people, Zapruder says that the most beautiful thing about the poems most important to him is that their meaning cannot fully be articulated.

I have found that the poems which have meant the most to me, to which I return again and again, retain a central unsayability, a place where the drama of truly looking for something essential that can never quite be reached is expressed. Somewhere in the poem, or at its end, knowingness stops. You can feel the intelligence in the poem truly exploring, clambering along the words and down the page, and also that intelligence stopping at what cannot be known. Those moments where a limit is reached can often be the greatest, and most honest, in poetry. They can come first as a surprise, then immediately afterward feel inevitable, at least for a little while.

This is why asking for a certain kind of knowledge—that way of knowing we automatically, and justifiably, expect from other texts, anything other than a poem—limits our experience with poetry. If we imagine a poem as something to be answered or solved, we will most likely find ways to do so. But I think we would be better off to think of “understanding” in a poem as an ongoing process of attention.

Simone Weil writes that attention is the purest form of generosity. A generous, open, genuinely focused attention moves us through the poem, just as it moves us through an experience, through a friendship, through anything else that means and keeps on meaning. If a poem is really good, you can’t really say what it’s “about,” that is, what its central “message” is, any more than you can do so for a painting or a piece of music or a person or a mountain.

A poem is like a person. The more you know someone, the more you realize there is always something more to know and understand. A final understanding could probably only begin upon permanent separation, or death. This is why we come back to certain poems, as we do to places or people, to experience and re-experience, to see ourselves for who we truly are, and to continue to be changed.

Read the story

Corals and Crabs Get Moonstruck, Too

The moon has been on my mind lately. Maybe it’s the upcoming solar eclipse (of which I’ll only get to see 88 percent, alas), or the number of times “lunatic” has been used in political commentary over the past few months. Of course, if you’re a coral reef off the coast of Australia, the moon has always been a crucial element in your existence (specifically: your sex life), and humans’ heliocentric obsessions are just plain silly. As Ferris Jabr lovingly shows at Hakai Magazine, moonlight has only recently started to receive the attention it deserves from marine biologists and other environmental scientists — and their lateness is part of a broader, sun-versus-moon cultural binary that has perpetuated itself through the centuries.

In antiquity, the influence of the moon on earthbound life was intuited—and celebrated. Our ancestors revered the moon as the equal of the sun, a dynamic signature of time, and a potent source of fertility.

“Time was first reckoned by lunations, and every important ceremony took place at a certain phase of the moon,” wrote English classicist Robert Graves in The Greek Myths. A 25,000-year-old limestone carving discovered in a rock shelter in France depicts a pregnant woman holding what appears to be a bison horn with the swoop of a crescent moon and 13 small notches—a possible paean to reproductive and lunar cycles. And some early Meso-American cultures seemed to believe that the moon deity controlled sexuality, growth, rainfall, and the ripening of crops.

In more recent times, the importance of the moon to Earth’s creatures has been eclipsed by the great solar engine of life. The sun is searingly bright, palpably hot, bold, and unmissable; our steadfast companion for many of our waking hours. The moon is spectral and elusive; we typically catch it in glimpses, in partial profile, a smudge of white in the dark or a glinting parenthesis. Sunlight bakes the soil, bends the heads of flowers, pulls water from the seas. Moonlight seems to simply descend, deigning to visit us for the evening. We still perceive the sun as the great provider—the furnace of photosynthesis—but the moon has become more like mood lighting for the mystical and occult; more a symbol of the spirit world than of our own. “There is something haunting in the light of the moon; it has all the dispassionateness of a disembodied soul, and something of its inconceivable mystery,” wrote Joseph Conrad in Lord Jim. The sun’s immense power over Earth and its creatures is scientific fact; to endow the moon with equal power is to embrace fairy tales and ghost stories.

Read the story

‘You Start Hiring Job-Quitters’

When everyone is encouraged to think of herself as a business, working for anyone else can only ever be considered a training ground.

As companies have divested themselves of long-term obligations to workers (read: pensions, benefits, paths to advancement), employees (read: job-seekers) have developed a matching taste for short-term, commitment-free work arrangements. The aim in landing any given job is now to land another one elsewhere, using the position to develop transferable skills — and then to transfer. A job’s appeal becomes how lucrative it will be to quit it.

At Aeon, Ilana Gershon describes how this calculus of quitting changes workplace dynamics, management techniques, division of labor, and the nature of being co-workers. “After all,” Gershon writes, “everyone works in the quitting economy, and everyone knows it.”

If you are a white-collar worker, it is simply rational to view yourself first and foremost as a job quitter – someone who takes a job for a certain amount of time when the best outcome is that you quit for another job (and the worst is that you get laid off). So how does work change when everyone is trying to become a quitter? First of all, in the society of perpetual job searches, different criteria make a job good or not. Good jobs used to be ones with a good salary, benefits, location, hours, boss, co-workers, and a clear path towards promotion. Now, a good job is one that prepares you for your next job, almost always with another company.

Your job might be a space to learn skills that you can use in the future. Or, it might be a job with a company that has a good-enough reputation that other companies are keen to hire away its employees. On the other hand, it isn’t as good a job if everything you learn there is too specific to that company, if you aren’t learning easily transferrable skills. It isn’t a good job if it enmeshes you in local regulatory schemes and keeps you tied to a particular location. And it isn’t a good job if you have to work such long hours that you never have time to look for the next job. In short, a job becomes a good job if it will lead to another job, likely with another company or organisation. You start choosing a job for how good it will be for you to quit it.

Read the story

An Ode to Dishwashers, the Unsung Heroes of the Restaurant Kitchen

In the Washington Post, food critic Tom Sietsema signed up for a dishwashing shift at Caracol, a 250-seat Mexican restaurant in Houston, to experience the job Anthony Bourdain said taught him “every important lesson of my life.”

Dishwashers in the U.S. earn a median wage of $20,000 a year and are a critical component of the restaurant industry. As Emeril Lagasse puts it, “You can’t have a successful service in a restaurant without a great dishwasher.” More restaurants are finding ways to recognize and reward their dishwashers:

After years of performing tasks no one else wants to do — cleaning nasty messes, taking out trash, polishing Japanese wine glasses priced at $66 a stem (at Quince in San Francisco) — the unsung heroes of the kitchen might be finally getting their due.

This spring, chef Rene Redzepi of the world-renowned Noma in Copenhagen made headlines when he made his dishwasher, Ali Sonko, a partner in his business. The Gambian native helped Redzepi open the landmark restaurant in 2003. And in July, workers at the esteemed French Laundry in Yountville, Calif., one of master chef Thomas Keller’s 12 U.S. restaurants and bakeries, voted to give their most prestigious company honor, the Core Award, to a dishwasher: Jaimie Portillo, who says he has never missed a day of work in seven years.

Read the story

The Other National Pastime: Unusual Baby Names

Choosing a name for your baby is a culturally fraught decision. So much is at stake: will it invite bullying? Does it correctly channel the parents’ attitude toward the cultural zeitgeist? Is it optimized for relatability and uniqueness? In the New Yorker, Lauren Collins shares the story behind her second child’s name, a boy whose mixed Franco-American heritage added several layers of complexity to the process (who knew that a Kevin could never be taken seriously in Paris?). She also looks at the broader context of naming conventions in the U.S. — yet another realm in which American exceptionalism has played out in bizarre and unexpected ways.

In the U.S., as the law professor Carlton F. W. Larson has written, the selection of a child’s name falls within “a legal universe that has scarcely been mapped, full of strange lacunae, spotty statutory provisions, and patchy, inconsistent case law.” Generally, you can’t use a pictograph, an ideogram, a number, an obscenity, or a name that is excessively long, but the regulations vary wildly from state to state and are often the domain of randomly applied “desk-clerk law.” It’s unclear whether you can call your son Warren Edward Buffett, Jr., when you have not actually procreated with Warren Edward Buffett. There are stricter and clearer criteria for naming dogs and horses than there are for naming people. (The American Kennel Club prohibits, among other things, the words “champ,” “champion,” “sieger,” “male,” “stud,” “sire,” “bitch,” “dam,” and “female,” while the Jockey Club recently went to court to block the registration of a filly named Sally Hemings, which has since been rebaptized Awaiting Justice.) Some of the rules have more to do with keyboards than with child protection. In California, amazingly, you can be Adolf Hitler Smith, but not José Smith, because of a ban on diacritics.

The exuberance of American names has been one of the country’s hallmarks since its founding. In sixteenth-century England, the Puritans started using their children’s birth certificates as miniature sermons. They produced some doozies: Humiliation Hynde, Kill-sin Pimple, Praise-God Barebone (whose son, If-Christ-had-not-died-for-thee-thou-hadst-been-damned Barebone, eventually went by Nicholas Barbon). Charles II largely stamped out the trend during the Restoration, but the Puritans continued the practice in the New World. The Claps—a Roger and Johanna who immigrated to Dorchester in 1630—produced a virtue-themed progeny that included Experience, Waitstill, Preserved, Hopestill, Wait, Thanks, Desire, Unite, and Supply, making them perhaps the Kardashians of Colonial Massachusetts.

Read the story

The Brief Career and Self-Imposed Exile of Jutta Hipp, Jazz Pianist

Aaron Gilbreath | This Is: Essays on Jazz | Outpost19 | August 2017 | 21 minutes (5,900 words)

In 1960, four years after the venerable Blue Note Records signed pianist Jutta Hipp to their label, she stopped performing music entirely. Back in her native Germany, Hipp’s swinging, percussive style had earned her the title of Europe’s First Lady of Jazz. When she’d moved to New York in 1955, she started working at a garment factory in Queens to supplement her recording and performing income. She played clubs around the city. She toured. Then, with six albums to her name and no official explanation, she quit. She never performed publicly again, and she told so few people about her life in music that most of her factory coworkers and friends only discovered it from her obituary. For the next forty-one years, Jutta patched garments for a living, painted, drew, and took photos for pleasure, all while royalties accrued on Blue Note’s books.

Read more…

Cory Taylor Answers Your Questions About Dying

Celebrated Australian novelist Cory Taylor was diagnosed with cancer in 2005. Rejecting the taboos that prevent humans from talking openly about death, she goes on the record with her answers to some of the most typical questions people have asked her about dying. In her piece at The New Yorker, she talks about her regrets, fears, priorities, what she’ll miss most, and how she’d like to be remembered. Taylor passed away on July 5th, 2016. Her book, Dying: A Memoir, was published in the United States on August 1st, 2017.

A few months back, I was invited to take part in a program for ABC television called “You Can’t Ask That.” The premise of the show is that there are taboo subjects about which it is difficult to have an open and honest conversation, death being one of them. The producer of the program explained that I would be required to answer a number of questions on camera. She said questions had been sent in from all over the country, and the ten most common had been selected. I wasn’t to know what these were until the day I went into the studio for the filming.

It turned out that the producer of the program herself had a need to talk about death, as she had recently lost her father to cancer, and was struggling to cope. This is so often the case with people I talk to about my situation: they listen for a while, then they tell me their own death story, but always with a vague sense that it is shameful, that the whole sorry business is somehow their fault. In taking part in “You Can’t Ask That,” I wanted to do my bit to change things around, to win back some dignity for the dying, because I don’t think silence serves the interests of any of us.

The questions, as it turned out, were unsurprising. Did I have a bucket list, had I considered suicide, had I become religious, was I scared, was there anything good about dying, did I have any regrets, did I believe in an afterlife, had I changed my priorities in life, was I unhappy or depressed, was I likely to take more risks given that I was dying anyway, what would I miss the most, how would I like to be remembered? These were the same questions I’d been asking myself ever since I was diagnosed with cancer, back in 2005. And my answers haven’t changed since then. They are as follows.

Read the story

Miles to Go Before You Sleep


A new Discovery Channel show, Darkness, sends three strangers into a cave or abandoned mineshaft, giving them six days to find each other and a way out — with no light, at all, at any time. For Esquire, Patrick Blanchfield takes a deep look at the premise, the participants, and the crew, who also have to spend the week in the dark. Leaving aside the cold, and the hallucinations, and the high potential for physical injury, there’s the issue of sleep: how do you sleep normally with no light or social cues? You don’t.

Brandon’s experience gets at another challenge of surviving underground, in the dark or otherwise: what happens to your sense of time. Brandon fell asleep twice, and only for thirty or forty minutes at a go. But when he awoke, he was certain that he’d been asleep for two eight-to-ten-hour stretches. When the safety crew came to retrieve him, Brandon was adamant he’d been underground for two full days. In reality, he’d only been below for twelve hours.

Scientists have documented this phenomenon extensively. Researchers who have undertaken simultaneous but separate sojourns into caves for extended periods will emerge with radically different estimates of how long they’ve been below—different from one another by weeks, and different from the calendar by yet more. Absent cues from the aboveground natural world or data from clocks or phones, our conscious perception of time can get weird, fast.

But that’s nothing compared to what goes on inside our bodies. When people talk about your “circadian rhythm,” they’re actually referring to dozens of different physiological processes, cycles governing everything from your heart rate to your breathing to your immune system to your digestion to your body temperature. These sub-systems operate on their own timelines, but are largely kept in sync with each other as long as the body follows a roughly 24-hour cycle that tracks changes in ambient light and various social cues. In situations of irregular light and darkness, everything goes out of whack within a couple of days. It is not uncommon for test subjects living underground to start sleeping and waking in forty-eight-hour cycles, or to experience bizarre changes in their behavior or sense of self. Michel Siffre, a European scientist, spent months at a time in half-lit caves in the Alps and Texas as part of research he carried out for NASA. Siffre not only got hypothermia, but also went off the rails, in one instance desperately trying to befriend a mouse for companionship but instead accidentally crushing it and falling into near-suicidal despair. When asked about the impact of those experiments on his mind and body, Siffre, who’s now in his seventies, describes it as “hell” and speaks of feeling like “a semi-detached marionette.”

Read the story

‘Everyone is Guilty All the Time’

Noura Jackson spent nine years in prison after being convicted of murdering her mother, despite a complete lack of physical proof — and prosecutor Amy Weirich withheld other evidence that could have supported Jackson’s claim of innocence. This isn’t the first time Weirich has been found to have withheld evidence. And according to other lawyers who spoke with Emily Bazelon, whose impressively deep dive into the case appears in The New York Times Magazine, the convict-or-else attitude that drives prosecutorial misconduct is alive and well in Weirich’s office.

Weirich is now the district attorney, overseeing all prosecutions in Shelby County, Tennessee.

When Amy Weirich learned to try cases in Shelby County in the 1990s, her office had a tradition called the Hammer Award: a commendation with a picture of a hammer, which supervisors or section chiefs typically taped on the office door of trial prosecutors who won big convictions or long sentences. When Weirich became the district attorney six years ago, she continued the Hammer Awards. I spoke to several former Shelby County prosecutors who told me that the reward structure fostered a win-at-all-costs mind-set, fueled by the belief that “everyone is guilty all the time,” as one put it. “The measure of your worth came down to the number of cases you tried and the outcomes,” another said. (They asked me not to use their names because they still work as lawyers in Memphis.) One year, the second former prosecutor told me, he dismissed the charges in multiple murder cases. “The evidence just didn’t support a conviction,” he said. “But no, I didn’t get credit from leadership. In fact, it hurt me. Doing your prosecutorial duty in that office is not considered helpful.” Weirich disagrees, saying “Every assistant is told to do the right thing every day for the right reasons.”

Read the story