
The Religion No One Talks About: My Search For Answers in an Old Caribbean Faith

Illustration by Missy Chimovitz

Sarah Betancourt | Longreads | March 2018 | 23 minutes (5,704 words)

 

There are things in life a Puerto Rican doesn’t talk about. One is the mesa blanca, or white table, in the laundry room, with statues of St. Michael, St. Lazarus, and others whose names you might not know. For years, I assumed leaving coffee in front of those other statues, trading out stale bread for fresh, and listening to nine days of prayers (la novena) after a death was just normal American life. Catholicism was for Sundays; Espiritismo was the rest of the time. By the time I was 9, I realized there was a reason my parents locked the laundry room door when white people came to our house.

***

The last thing I packed when I left Manhattan for Florida on September 12, 2015, was an old plastic rosary, worn and smelling of incense embedded in the yellowing nylon between each of the 60 beads. Seven hours later, I changed into a pink t-shirt in a dingy airport stall. My abuela loved pink. Twenty minutes after that, I was standing in front of a hospice, hating how bright the sunlight was, wishing away the flowers.

I didn’t recognize her on the bed until I saw the familiar grey blue of her eyes. I was hoping that in her mind, she was on a beach somewhere, maybe dipping her feet into the sands by her hometown in Puerto Rico, not here, in this bed, in this 50-pound body. My godfather puffed up his chest and said, “She’s been traveling this week. Seeing people.”

She should have been dead days earlier. Everyone said, “She waited for you. She needs to speak with you.” Her last words (“estoy cansada,” “I’m tired”) were spoken a week before. Alone in the room, I pulled over a chair, and touched her arms. She lay completely still, her drifting right eye trying to focus. I dipped a Q-tip in water to wet her hard tongue, brushed her hair as it fell like snowflakes on my hands, pulled out my Chapstick to give her lips relief. No reaction.

Catholicism was for Sundays; Espiritismo was the rest of the time.

I had forgotten that her solace couldn’t be found in the physical. Santa Betancourt had been a spiritual woman for every single one of her 94 years. As a trained healer in the faith of Espiritismo, she had people asking her to fix them, to solve their problems. Every time I saw her, I would greet her with un beso (a kiss) and “la bendicion,” not knowing for many years that it was more than a phrase of recognition, but a request for her blessing. I had never seen her ask anyone but God to heal her own pains. She hated going to the doctor.

I pulled out the tiny blue book she had given me, hoping that the complex religious words would make some sense. I placed the rosary in her hand and asked her if she wanted me to pray. I mentioned it wouldn’t be great — I had been agnostic for 10 years, and didn’t know what to believe. Her eye stopped swimming, and her finger moved. I pulled up the rosary on my phone, lay my head next to hers, and began.

Read more…

David Chang’s ‘Ugly Delicious’ Pushes Food TV in the Right Direction

David Chang with South Philly Barbacoa's Cristina Martinez in 'Ugly Delicious.'

There’s no denying that David Chang’s new Netflix docuseries, “Ugly Delicious,” is aesthetically gorgeous. The show’s underlying concept—that “ugly” foods like tacos, barbecue, and fried rice have an intrinsic value that transcends their origins in necessity and a lowly legacy—is a sui generis angle on a well-worn genre that long ago shifted toward food porn rather than pursuing and examining the cultural and geopolitical value food possesses.

In a recent interview with Grub Street, “Top Chef” judge and chef Tom Colicchio mentioned the rise of “unfussy” food on the program’s 15th season: “The chefs were doing more, I wouldn’t say rustic, but a much more conventional style of food.” Translation: This shift isn’t occurring in a vacuum.

As the New Yorker‘s Helen Rosner explains in her review of the eight-part series, “What makes “Ugly Delicious” compelling, ultimately, is Chang’s commitment to rejecting purity and piety within food culture…In food culture, particularly American food culture, the concept of authenticity is wielded like a hammer…[and] the problem with such rigid categorizations, according to “Ugly Delicious,” is, for one thing, creative stagnation.”

This certainly makes for a thoroughly interesting viewing experience; before I realized it, I had binge-watched four episodes. This sort of programming is also refreshing—Chang has subverted a genre. For a generation bred on the gluttony of glossy networks and competitive cooking, “Ugly Delicious” throws up a middle finger and instead asks questions relevant to how we should be thinking about food (not just consuming it for sheer shock value). Read more…

Hope at Any Price

Illustration by Xenia Latii

Lindsay Gellman | Longreads | March 2018 | 23 minutes (5,717 words)

Read the story in English

Soon after Kate Colgan’s mother, Janet, awoke from anesthesia last summer in a hospital near Manchester, U.K., she had a simple request: “Get me to Germany.”

So Kate, 25, fitted the family sedan with a roof rack and loaded it with luggage. She arranged her mother’s discharge from the hospital against doctors’ orders and eased her carefully from a wheelchair into the passenger seat. Kate’s then-fiancé Chad then drove them, along with the couple’s young daughter, 16 hours straight to a private clinic on the outskirts of Dornstetten, a quiet medieval town between Stuttgart and Freiburg.

Janet was diagnosed with metastatic stomach cancer in September 2016. Doctors with the National Health Service gave her at most a year to live and offered only palliative chemotherapy.

Choosing palliative care felt to Kate like an admission of defeat. She scoured the internet for other options and came across the Hallwang Private Oncology Clinic, a facility that operates outside the strictly regulated German hospital system. The Hallwang Clinic has distinguished itself in recent years amid a bevy of cancer clinics that have gained a foothold in Germany, marketing itself as a kind of luxury spa with tailor-made treatments, an idyllic setting in the Black Forest, and fine meals served in a dining room.

The clinic’s online testimonials looked promising, so the Colgans inquired about treatment. After reviewing Janet’s medical records, a Hallwang Clinic doctor told the Colgans that, with the help of an experimental drug cocktail not readily available elsewhere, Janet could achieve remission of her disease. But the price would be enormous: more than 100,000 euros. The clinic does not bill health insurance and typically requires an 80 percent deposit before treatment begins.

A chance at remission seemed worth trying, at any cost.

Read more…

The Last Resort

Illustration by Xenia Latii

Lindsay Gellman | Longreads | March 2018 | 23 minutes (5,754 words)

Read the story in German

Soon after Kate Colgan’s mother, Janet, awoke from surgery in a hospital near Manchester, U.K., last summer, she made a simple request of her daughter: “Get me to Germany.”

So Kate, then 25, fitted the family sedan with a roof rack and piled it with luggage. She arranged for her mother’s voluntary discharge from the hospital, against doctors’ wishes, and eased her from a wheelchair into the car’s passenger seat. Kate’s then-fiancé Chad drove them, along with the couple’s infant daughter, some 16 hours straight to a private treatment clinic on the outskirts of Dornstetten, a quiet medieval town in southern Germany.

Janet was diagnosed with metastatic stomach cancer in September 2016, when she was 54 years old. British doctors with the National Health Service gave her up to a year to live and offered only palliative care with chemotherapy.

Choosing palliative care felt to Kate like giving up. She scoured the web for other options for her mother, and came across the Hallwang Private Oncology Clinic, a for-profit institution that operates outside of the strictly regulated German hospital system. The Hallwang Clinic has emerged in recent years as the highest profile of a bevy of cancer clinics to gain traction in Germany. It markets itself as a luxury spa of sorts, touting its individualized treatments, pastoral setting in southern Germany’s Black Forest, and delicately plated dining-room meals.

The clinic’s online testimonials looked promising, so the Colgans inquired about treatment. After reviewing Janet’s medical records, a Hallwang Clinic doctor told the Colgans a cocktail of experimental drugs not widely available elsewhere could mean eventual remission for Janet. But the price would be staggering — more than $120,000. The clinic does not accept insurance and typically requires an 80% deposit before treatment can begin.

A chance at remission seemed worth a try — at any cost.

Read more…

Seeking a Roadmap for the New American Middle Class

The next American middle class
Illustration by Zoë van Dijk

Livia Gershon | Longreads | March 2018 | 8 minutes (1,950 words)

Over the past few months, Starbucks, CVS, and Walmart announced higher wages and a range of other benefits like paid parental leave and stock options. Despite what the brands say in their press releases, the changes probably had little to do with the Republican corporate tax cuts, but they do reflect a broader economic prosperity, complete with a tightening labor market. In the past couple of years, real wages hit their highest levels ever, and even the lowest-paid workers started getting raises. As Matt Yglesias wrote at Vox, “for the first time in a long time, the underlying labor market is really healthy.”

But it doesn’t feel that way, does it? From the new college graduate facing an unstable contract job and mounds of debt to the 30-year-old in Detroit picking up an extra shift delivering pizzas this weekend, it just seems like we’re missing something we used to have.

In a 2016 Conference Board survey, only 50.8 percent of U.S. workers said they were satisfied with their jobs, compared with 61 percent in 1987 when the survey was first done. In fact, job satisfaction hasn’t come close to that first reading in this century. We’re also more anxious and depressed today than we’ve been since the depths of the recession, and we’re dying younger — particularly if we’re poor.

So maybe this is a good moment to stop and think about what really good economic news would look like for American workers. Imagine for a moment that everything goes right. The long, slow recovery from the Great Recession continues, rather than reversing itself and plunging us back into high unemployment. Increased automation doesn’t displace a million truck drivers but creates new, more skilled driving jobs. The retirement of the Baby Boomers reduces labor supply, driving up wages at nursing homes, call centers, and the rest of the gigantic portion of the economy where pay is low.

Would this restore dignity to work and a sense of optimism to the nation? Would it bring back the kind of pride we associate with the 1950s GM line worker?

I don’t think it would. I think it would take far more fundamental changes to win justice for American workers. But I also think it’s possible to strive for something way better than the postwar era we often remember as a Golden Age for workers.

Let’s start by dispelling the idea that postwar advances for American workers were some kind of natural inevitability that could never be replicated today. Yes, in the 1940s, the United States was in a commanding position of economic dominance over potential rivals decimated by war. And yes, companies were able to translate the manufacturing capacity and technological know-how built up through the military into astounding new bounty for consumers. But, when it comes to profitability, business has also had plenty of boom times in recent decades, with no parallel advances for workers.

This is the moment to stop and think about what really good economic news would look like for American workers.

Let’s also set aside the nostalgia about how we used to make shit in this country. Page through Working, Studs Terkel’s classic 1972 book of interviews with a broad range of workers, and factories come across as a kind of hellscape. A spot welder at a Ford plant in Chicago describes standing in one place all day, with constant noise too loud to yell over, suffering frequent burns and blood poisoning from a broken drill, at risk of being fired if he leaves the line to use the bathroom. “Repetition is such that, if you were to think about the job itself, you’d slowly go out of your mind,” he told Terkel.

The stable, routine corporate office work that also thrived in the postwar era certainly wasn’t as unpleasant as that, but a whole world of cultural figures, from Willy Loman to Michael Scott, suggests it was never an inherent font of meaning.

The fact that the Golden Age brought greater wealth, pride, and status to American workers, both blue- and white-collar, wasn’t really about the booming economy or the nature of the work. It was a result of power politics and deliberate decisions. In the 1930s and ‘40s, unionized workers, having spent decades battling for power on the job, at severe risk to life and livelihood, were a powerful force. And CEOs of massive corporations like General Motors were scared enough of radical workers, and hopeful enough about the prospects of shared prosperity, to strike some deals.

A consensus about how jobs ought to work emerged from these years. Employers would provide decent pay, health insurance, and pensions for large swaths of the country’s workers. The federal government would build a legal framework to address labor disputes and keep corporate monopolies from getting out of control. Politicians from both parties would march in the Labor Day parade every year, and workers would get their fair share of the new American prosperity.

Today, of course, the postwar consensus has broken down. Even if average workers are making more money than we used to, the gap between average and super-rich makes us feel like we’re getting nowhere. We may be able to afford iPhones and big-screen TVs, but we’ve got minimal chances of getting our kids into the elite colleges that define the narrow road to success.

And elite shows of respect for workers ring more and more hollow. Unions, having drastically declined in membership, no longer have a seat at some of the tables they used to. Politicians celebrate businesses’ creation of jobs, not workers’ accomplishment of necessary and useful labor. A lot of today’s masters of industry clearly believe that workers are an afterthought, since robots will soon be able to do anyone’s jobs except theirs.

But let’s not get too nostalgic about the Golden Age. As many readers who are not white men may be shouting at me by this point, there was another side to these mid-century ideas about work. The entire ideological framework defining a job with dignity was inextricably tied up with race and gender.

From the start of the industrial revolution, employers used racism to divide workers. And union calls for respect and higher wages were often inseparable from demands that companies hire only white men. The Golden Age didn’t just provide white, male workers with higher wages than everyone else but also what W.E.B. Du Bois called the “public and psychological wage” of a sense of racial superiority.

Just as importantly, white men in the boom years also won stay-at-home wives. With rising male wages, many white women — and a much smaller number of women of other races — could now focus all their energy on caring for home and family. For the women, that meant escape from working at a mill or cooking meals and doing laundry for strangers. But it also meant greater economic dependence on their husbands. For the men, it was another boost to their living standard and status.

Golden Age corporate policies, union priorities, and laws didn’t create the ideal of the white, breadwinner-headed family, but they did reinforce it. Social Security offered benefits to workers and their dependents rather than to all citizens, and excluded agricultural and domestic workers, who were disproportionately black. The GI Bill helped black men far less than white ones and left out most women except to the extent that their husbands’ benefits trickled down to them.

Let’s also set aside the nostalgia about how we used to make shit in this country.

Today, aside from growing income inequality, unstable jobs, and the ever-skyward climb of housing and education costs, a part of the pain white, male workers are feeling is the loss of their unquestioned sense of superiority.

So, can we imagine a future Golden Age? Is there a way to make working for Starbucks fulfill all of us the way we remember line work at GM fulfilling white men? Maybe. With an incredible force of political will, it might be possible to rejigger the economy so that modern jobs keep getting better. It would start with attacking income inequality head-on. The government could bust up monopolistic tech giants, encourage profit-sharing, and maybe even take a step toward redistributing inherited wealth. We’d also need massive social change to ensure people of color and women equal access to the good new jobs, and men and white people would need to learn to live with a loss of the particular psychological wages of masculinity and whiteness.

But even all that would still fail to address one thing that made work in the Golden Age fulfilling for men: the wives. Stay-at-home moms of the mid-twentieth century weren’t just a handy status symbol for their men. They were household managers and caregivers, shouldering the vast majority of child-raising labor and creating a space where male workers could rest and be served. And supporting a family was a key ingredient that made otherwise draining, demeaning jobs into a source of meaning.

Few men or women see a return to that ideal as a good idea today. But try imagining what good, full-time work for everyone looks like without it. Feminist scholar Nancy Fraser describes that vision as the Universal Breadwinner model — well-paid jobs, with all the pride and status that come with them, for all men and women. She notes that it would take massive spending to outsource childcare and other traditionally unpaid “female” work — particularly since those jobs would need to be good jobs too. It would also leave out people with personal responsibilities that they couldn’t, or wouldn’t, hand over to strangers, as well as many with serious disabilities. And it certainly wouldn’t solve the problem many mothers and fathers report today of having too little time to spend with family.

A really universal solution to the problem of bad jobs would have to go beyond “good jobs” in the Golden Age model. It would be a world where we can take pride in our well-paid jobs at Starbucks without making them the center of our identities. That could mean many more part-time jobs with flexible hours, good pay, and room for advancement. It could mean decoupling benefits like health care and retirement earnings from employment and providing a hefty child allowance. Certainly, it would mean a social and psychological transformation that lets both men and women see caring work, and other things outside paid employment, as fully valuable and meaningful as a job.

As a bonus, this kind of solution would also make sense when we do fall back into recession, or if the robots do finally come for a big chunk of our jobs.

All this might sound absurdly utopian. We are, after all, living in a world where celebrity business leaders claim to work 80-plus-hour weeks while politicians enthusiastically deny health care to people who can’t work.

But the postwar economy didn’t happen on its own. It was the product of a brutal, decades-long fight led by workers with an inspiring, flawed vision. And today, despite everything, new possibilities are emerging. Single-payer health care is a popular idea, and “socialism” has rapidly swung from a slur to a legitimate part of the political spectrum. Self-help books like The 4-Hour Work Week — which posit the possibility of a radically different work-life balance, albeit based on individual moxie rather than social change — have become a popular genre. Young, black organizers in cities across the country are developing their own cooperative economic models. And if there’s any positive lesson we can take from the current political moment, it’s that you never know what could happen in America. Maybe a new Golden Age is possible. It’s at least worth taking some time to think about how we would want it to look.

***

Livia Gershon is a freelance journalist based in New Hampshire. She has written for the Guardian, the Boston Globe, HuffPost, Aeon and other places.

 

To Be a Lexicographer Is to Surrender to Folly

Image by GCShutter (via Getty Images)

The arrival in the past 30 years of search engines and vast databases of electronic texts has made dictionaries far more comprehensive, but also much more complicated to compile and update (not that the task was easy to begin with). Andrew Dickson’s Guardian piece on the history of the Oxford English Dictionary focuses on the tension between the cumulative, decades-long process of updating the OED and a world in which few people still pay for hard copies and Google reaps most of the ad revenue from online queries. What is it that keeps this endeavor chugging along? There’s institutional inertia at work, no doubt, but also something even more amorphous: the vocational resistance — one might call it love? — of lexicographers who have embarked on a journey whose end they know they’ll never see.

It takes a particular sort of human to be a “word detective”: something between a linguistics academic, an archival historian, a journalist and an old-fashioned gumshoe. Though hardly without its tensions — corpus linguists versus old-school dictionary-makers, stats nerds versus scholarly etymologists — lexicography seems to be one specialist profession with a lingering sense of common purpose: us against that ever-expanding, multi-headed hydra, the English language. “It is pretty obsessive-compulsive,” Jane Solomon said.

The idea of making a perfect linguistic resource was one most lexicographers knew was folly, she continued. “I’ve learned too much about past dictionaries to have that as a personal goal.” But then, part of the thrill of being a lexicographer is knowing that the work will never be done. English is always metamorphosing, mutating, evolving; its restless dynamism is what makes it so absorbing. “It’s always on the move,” said Solomon. “You have to love that.”

There are other joys, too: the thrill of catching a new sense, or crafting a definition that feels, if not perfect, at least right. “It sounds cheesy, but it can be like poetry,” Michael Rundell reflected. “Making a dictionary is as much an art as a craft.”

Read the story

How Black Panther Asks Us to Examine Who We Are To One Another

Marvel Studios

Rahawa Haile | Longreads | February 2018 | 12 minutes (3,078 words)

(Spoiler alert! This essay contains numerous spoilers about the film Black Panther.)

By the time I sat down to watch Ryan Coogler’s Black Panther, a film about a thriving, fictional African country that has never been colonized, 12 hours had passed since the prime minister of Ethiopia resigned following years of protest and civil unrest. It would be another 12 hours before the country declared a state of emergency and enforced martial law, as the battle for succession began. Ethiopia has appeared in many conversations about Black Panther since the film’s release, despite an obvious emphasis on Wakanda, the Black Panther’s kingdom, being free of outside influences — and finances.

While interviews with Coogler reveal he based Wakanda on Lesotho, a small country surrounded on all sides by South Africa, it has become clear that most discussions about the film share a similar geography; its borders are dimensional rather than physical, existing in two universes at once. How does one simultaneously argue the joys of recognizing the Pan-African signifiers within Wakanda, as experienced by Africans watching the film, and the limits of Pan-Africanism in practice, as experienced by a diaspora longing for Africa? The beauty and tragedy of Wakanda, as well as our discourse, is that it exists in an intertidal zone: not always submerged in the fictional, as it owes much of its aesthetic to the Africa we know, but not entirely real either, as no such country exists on the African continent. The porosity and width of that border complicates an already complicated task, shedding light on the infinite points of reference possible for this film that go beyond subjective readings.
Read more…

The Internet Isn’t Forever

Illustration by Shannon Freshwater

Maria Bustillos | Columbia Journalism Review | February 2018 | 12 minutes (2,900 words)

This story is published in collaboration with the Columbia Journalism Review, whose Winter 2018 issue covers threats to journalism.

The Honolulu Advertiser doesn’t exist anymore, but it used to publish a regular “Health Bureau Statistics” column in its back pages, supplied with information from the Hawaii Department of Health detailing births, deaths, and other events. In the decades since the end of World War II, the paper, which began in 1856 as the Pacific Commercial Advertiser, was merged, bought, sold, and then merged again with its local rival, The Honolulu Star-Bulletin, becoming The Honolulu Star-Advertiser in 2010. But the Advertiser archive is still preserved on microfilm in the Honolulu State Library. Who could have guessed, when those reels were made, that the record of a tiny birth announcement would one day become a matter of national consequence? But there, on page B-6 of the August 13, 1961 edition of The Sunday Advertiser, set next to classified listings for carpenters and floor waxers, are two lines of agate type announcing that on August 4, a son had been born to Mr. and Mrs. Barack H. Obama of 6085 Kalanianaole Highway.

In the absence of this impossible-to-fudge bit of plastic film, it would have been far easier for the so-called birther movement to persuade more Americans that President Barack Obama wasn’t born in the United States. But that little roll of microfilm is still there, ready to be threaded onto a reel and examined in the basement of the Honolulu State Library: an unfalsifiable record of “Births, Marriages, Deaths,” which immeasurably fortified the Hawaii government’s assertions regarding Obama’s original birth certificate. “We don’t destroy vital records,” Hawaii Health Department spokeswoman Janice Okubo says. “That’s our whole job, to maintain and retain vital records.” Read more…

The Only Downside to Lower Infant-Mortality Rates? All Those Baby Books

We typically think of narcissists as people with an inflated sense of their own uniqueness. If you’ve been around a parent in the past, say, century, you will have been subjected to a more peculiar type of narcissism: the one that assumes the universality of their highly anecdotal experience. (As a parent myself, I’m certainly part of the problem.) In the Guardian, Oliver Burkeman shows how the baby-advice literary-consumerist complex has capitalized on this tendency, producing book after book filled with often-useless, self-contradictory insights.

It’s no surprise, argues Burkeman, that this publishing explosion came about just as newborns’ chances of survival increased dramatically. The removal of most life-threatening circumstances from the experience of giving birth and raising an infant opened up the space for anxiety around the more trivial aspects of parenting.

Child mortality began to decline precipitously from the turn of the century, and with it, the life-or-death justification for this kind of advice. But the result was not a new generation of experts urging parents to relax, on the grounds that everything would probably be fine. (Books informed by 20th-century psychoanalysis, such as those by Benjamin Spock and Donald Winnicott, would later advise a far less rigid approach, arguing that a “good enough mother”, who didn’t always follow the rules perfectly, was perhaps even better than one who did, since that helped babies gradually to learn to tolerate frustration. But they were still half a century away.)

Instead, the anxiety that had formerly attached itself to the risk of a child dying took a more modern form: the fear that a baby reared with too much indulgence might grow up “coddled”, unfit for the new era of high technology and increasing economic competition; or even, as at least one American paediatrician warned, ripe for conversion to socialism. “When you are tempted to pet your child,” wrote the psychologist John Watson in 1928, in his book Psychological Care of Infant and Child, which was hardly idiosyncratic for its time, “remember that mother love is a dangerous instrument. An instrument which may inflict a never-healing wound, a wound which may make infancy unhappy, adolescence a nightmare, an instrument which may wreck your adult son or daughter’s vocational future and their chances for marital happiness.”

Thus began the transformation that would culminate in the contemporary baby-advice industry. With every passing year, there was less and less to worry about: in the developed world today, by any meaningful historical yardstick, your baby will almost certainly be fine, and if it isn’t, that will almost certainly be due to factors entirely beyond your control. Yet the anxiety remains – perhaps for no other reason than that becoming a parent is an inherently anxiety-inducing experience; or perhaps because modern life induces so much anxiety for other reasons, which we then project upon our babies. And so baby manuals became more and more fixated on questions that would have struck any 19th-century parent as trivial, such as for precisely how many minutes it’s acceptable to let babies cry; or how the shape of a pacifier might affect the alignment of their teeth; or whether their lifelong health might be damaged by traces of chemicals in the plastics used to make their bowls and spoons.

Read the story

Reclaiming Our Rage

(Raquel Minwell/EyeEm/Getty)

There’s a lot being written about women and anger right now and I am here for all of it.

Rebecca Traister, who is writing a book on the subject, recently posted a thread on Twitter pointing to a number of recent articles on women’s anger: “Does This Year Make Me Look Angry?,” by Ijeoma Oluo in Elle; “#MeToo Isn’t Enough. Now Women Need to Get Ugly,” by Barbara Kingsolver in the Guardian; “We are Living Through the Moment When Women Unleash Decades of Pent-Up Anger,” by Katha Pollitt in The Nation; “Most Women You Know Are Angry — And That’s Alright,” by Longreads columnist Laurie Penny in Teen Vogue.

But one piece she included resonated with me on a deeply personal level: “I Used to Insist I Didn’t Get Angry. Not Anymore,” by Leslie Jamison in The New York Times Magazine.

Jamison examines women’s long-standing conditioning against owning and expressing anger, which leads them instead to sublimate their rage in sadness, historically a more acceptable emotion. I know this mechanism all too well. It long ago became second nature for me to respond to affronts and offenses of all kinds by bursting into tears and withdrawing deep into sorrow rather than raging, or even just speaking up for myself in a firm and reasonable way. In my 50s, I’m only now learning how to do the latter, and usually only after first defaulting to the emotional bypass of crying. For so many of us — maybe for most women — this conditioning is difficult to root out because of a culture that taught us our anger makes us threatening.

The phenomenon of female anger has often been turned against itself, the figure of the angry woman reframed as threat — not the one who has been harmed, but the one bent on harming. She conjures a lineage of threatening archetypes: the harpy and her talons, the witch and her spells, the medusa and her writhing locks. The notion that female anger is unnatural or destructive is learned young; children report perceiving displays of anger as more acceptable from boys than from girls. According to a review of studies of gender and anger written in 2000 by Ann M. Kring, a psychology professor at the University of California, Berkeley, men and women self-report “anger episodes” with comparable degrees of frequency, but women report experiencing more shame and embarrassment in their aftermath. People are more likely to use words like “bitchy” and “hostile” to describe female anger, while male anger is more likely to be described as “strong.” Kring reported that men are more likely to express their anger by physically assaulting objects or verbally attacking other people, while women are more likely to cry when they get angry, as if their bodies are forcibly returning them to the appearance of the emotion — sadness — with which they are most commonly associated.

A 2016 study found that it took longer for people to correctly identify the gender of female faces displaying an angry expression, as if the emotion had wandered out of its natural habitat by finding its way to their features. A 1990 study conducted by the psychologists Ulf Dimberg and L.O. Lundquist found that when female faces are recognized as angry, their expressions are rated as more hostile than comparable expressions on the faces of men — as if their violation of social expectations had already made their anger seem more extreme, increasing its volume beyond what could be tolerated.

Read the story