
True Roots

Daniel Berehulak/Getty Images

Ronnie Citron-Fink | True Roots | Island Press | June 2019 | 34 minutes (5,655 words)

 

How’d you do it? Are you doing that on purpose? Are you okay? Ever since I stopped coloring my silver hair, I’ve gotten a lot of questions. One of the most common during my hair transition was Why are you letting it go gray? While my roots didn’t ask permission before they stopped growing in dark brown, it was a complex mix of fear and determination that rearranged my beauty priorities. The question of why — why, after twenty-five years of using chemical dyes, I gave them up — is something I’ve thought about a lot.

My world began to shift four years ago. I was sitting in a meeting about toxics reform in Washington, DC, when an environmental scientist began to describe the buildup of chemicals in our bodies. As she rattled off a list of ingredients in personal care products — toluene, benzophenone, stearates, triclosan — my scalp started to tingle. “We’re just beginning to understand how these chemicals compromise long-term health,” she concluded.


The Artificial Intelligence of the Public Intellectual

morkeman / Getty

Soraya Roberts | Longreads | May 2019 | 8 minutes (2,228 words)

“Well, that’s a really important thing to investigate.” While Naomi Wolf’s intellectual side failed her last week, her public side did not. That first line was her measured response when a BBC interviewer pointed out — on live radio — that cursory research had disproven a major thesis in her new book, Outrages: Sex, Censorship, and the Criminalization of Love (she misinterpreted a Victorian legal term, “death recorded,” to mean execution — the term actually meant the person was pardoned). Hearing this go down, journalists like me theorized how we would react in similar circumstances (defenestration) and decried the lack of fact-checkers in publishing (fact: Authors often have to pay for their own). The mistake did, however, ironically, offer one corrective: It turned Wolf from cerebral superhero into mere mortal. No longer was she an otherworldly intellect who could suddenly complete her Ph.D. — abandoned at Oxford when she was a Rhodes Scholar in the mid-’80s, Outrages is a reworking of her second, successful, attempt — while juggling columns for outlets like The Guardian, a speaking circuit, an institute for ethical leadership, and her own site, DailyClout, not to mention a new marriage. Something had to give, and it was the Victorians.

Once, the public intellectual had the deserved reputation of a scholarly individual who steered the public discourse: I always think of Oscar Wilde, the perfect dinner wit who could riff on any subject on command and always had the presence of mind to come up with an immortal line like, “One can survive everything nowadays except death.” The public intellectual now has no time for dinner. Wolf, for instance, parlayed the success of her 1991 book The Beauty Myth into an intellectual career that has spanned three decades, multiple books, and a couple of political advisory jobs, in which time her supposed expertise has spread far beyond third-wave feminism. She has become a symbol of intellectual rigor that spans everything from vaginas to dictatorships — a sort of lifestyle brand for the brain. Other thought leaders like her include Jordan Peterson, Fareed Zakaria, and Jill Abramson. Their minds have hijacked the public trust, each one acting as the pinnacle of intellect, an individual example of brilliance to cut through all the dullness, before sacrificing the very rigor that put them there in order to maintain the illusion floated by the media, by them, even by us. The public intellectual once meant public action, a voice from the outside shifting the inside, but then it became personal, populated by self-serving insiders. The public intellectual thus became an extension — rather than an indictment — of the American Dream, the idea that one person, on their own, can achieve anything, including being the smartest person in the room as well as the richest.

* * *

I accuse the Age of Enlightenment of being indirectly responsible for 12 Rules for Life. The increasingly literate population of the 18th century was primed to live up to the era’s ultimate aspiration: an increasingly informed public. This was a time of debates, public lectures, and publications, and of fame for the academics behind them. Ralph Waldo Emerson, for one. In his celebrated “The American Scholar” speech from 1837, Emerson provided a framework for an American cultural identity — distinct from Europe’s — which was composed of a multifaceted intellect (the One Man theory). “The scholar is that man who must take up into himself all the ability of the time, all the contributions of the past, all the hopes of the future,” he said. “In yourself slumbers the whole of Reason; it is for you to know all, it is for you to dare all.” While Emerson argued that the intellectual was bound to action, the “public intellectual” really arrived at the end of the 19th century, when French novelist Émile Zola publicly accused the French military of antisemitism over the Dreyfus Affair in an open letter published in L’Aurore newspaper in 1898. With “J’Accuse…!,” the social commentary Zola spread through his naturalist novels was transformed into a direct appeal to the public: Observational wisdom became intellectual action. “I have but one passion: to enlighten those who have been kept in the dark, in the name of humanity which has suffered so much and is entitled to happiness,” he wrote. “My fiery protest is simply the cry of my very soul.”

The public intellectual thenceforth became the individual who used scholarship for social justice. But only briefly. After the Second World War, universities opened up to serve those who had served America, which led to a boost in educated citizens and a captive audience for philosophers and other scholars. By the end of the ’60s, television commanded our attention further with learned debates on The Dick Cavett Show — where autodidact James Baldwin famously dressed down Yale philosopher Paul Weiss — and Firing Line with William F. Buckley Jr. (also famously destroyed by Baldwin), which would go on to host academics like Camille Paglia in the ’90s. But Culture Trip editor Michael Barron dates the “splintering of televised American intellectualism” to a 1968 debate between Gore Vidal — “I want to make 200 million people change their minds,” the “writer-hero” once said — and Buckley, which devolved into playground insults. A decade later, the public intellectual reached its celebrity peak, with Susan Sontag introducing the branded brain in People magazine (“I’m a book junkie. … I buy special editions like other women shop for designer originals at Saks.”).

As television lost patience with Vidal’s verbose bravado, he was replaced with more telegenic — angrier, stupider, more right-wing — white men like Bill O’Reilly, who did not clarify nuance but blustered over the issues of the day; the public intellectual was now all public, no intellect. Which is to say, the celebrity pushed out the scholar, but it was on its way out anyway. By the ’80s, the communal philosophical and political conversations of the post-war era slunk back to the confines of academia, which became increasingly professionalized, specialized, and insular, producing experts with less general and public-facing knowledge. “Anyone who engages in public debate as a scholar is at risk of being labelled not a serious scholar, someone who is diverting their attention and resources away from research and publicly seeking personal aggrandizement,” one professor told University Affairs in 2014. “It discourages people from participating at a time when public issues are more complicated and ethically fraught, more requiring of diverse voices than ever before.” Diversity rarely got past the ivy, with the towering brilliance of trespassers like Baldwin and Zora Neale Hurston, among other marginalized writers, limited by their circumstances. “The white audience does not seek out black public intellectuals to challenge their worldview,” wrote Mychal Denzel Smith in Harper’s last year, “instead they are meant to serve as tour guides through a foreign experience that the white audience wishes to keep at a comfortable distance.”

Speaking of white audiences … here’s where I mention the intellectual dark web even though I would rather not. It’s the place — online, outside the academy, in pseudo-intellectual “free thought” mag Quillette — where reactionary “intellectuals” flash their advanced degrees while claiming their views are too edgy for the schools that graduated them. These are your Petersons, your Sam Harrises, your Ben Shapiros, the white (non)thinkers, usually men, tied in some vague way to academia, which they use to validate their anti-intellectualism while passing their feelings off as philosophy and, worse, as (mis)guides for the misguided. Last month, a hyped debate between psychology professor Peterson and philosopher Slavoj Žižek had the former spending his opening remarks stumbling around Marxism, having only just read The Communist Manifesto for the first time since high school. As Andray Domise wrote in Maclean’s, “The good professor hadn’t done his homework.” But neither have his fans.

But it’s not just the conservative public intellectuals who are slacking off. Earlier this year, Jill Abramson, the former executive editor of The New York Times, published Merchants of Truth: The Business of News and the Fight for Facts. She was the foremost mind on journalism in the Trump era for roughly two seconds before being accused of plagiarizing parts of her book. Her response revealed that the authorship wasn’t exactly hers alone, a fact that came to light only when she blamed others for her mistakes. “I did have fact-checking, I did have assistants in research, and in some cases, the drafting of parts of the book,” she told NPR. “I certainly did spend money. But maybe it wasn’t enough.” Abramson’s explanation implied a tradition in which, if you are smart enough to be rich enough, you can pay to uphold your intellectual reputation, no matter how artificial it may be.

That certainly wasn’t the first time a public intellectual overrepresented their abilities. CNN host Fareed Zakaria, a specialist in foreign policy with a Ph.D. from Harvard — a marker of intelligence that can almost stand in for actual acumen these days — has been accused multiple times of plagiarism, despite “stripping down” his extensive workload (books, speeches, columns, tweets). Yet he continues to host his own show and to write a column for The Washington Post in the midst of a growing number of unemployed journalists and a dwindling number of outlets. Which is part of the problem. “What happens in the media is the cult of personality,” said Charles R. Eisendrath, director of the Livingston Awards and Knight-Wallace Fellowship, in the Times. “As long as it’s cheaper to brand individual personalities than to build staff and bolster their brand, they will do it.” Which is why Wolf, and even Abramson, are unlikely to be gone for good.

To be honest, we want them around. Media output hasn’t contracted along with the industry, so it’s easier to follow an individual than a sprawling media site, just like it’s easier to consult a YouTube beauty influencer than it is to browse an entire Sephora. With public intellectuals concealing the amount of work required of them, the pressure to live up to the myth we are all helping to maintain only increases, since the rest of us have given up on trying to keep pace with these superstars. They think better than we ever could, so why should we bother? Except that, like the human beings they are, they’re cutting corners and making errors and no longer have room to think the way they did when they first got noticed. It takes significant strength of character in this economy of nonstop (and precarious) work to bow out, but Ta-Nehisi Coates did when he stepped down last year from his columnist gig at The Atlantic, where he had worked long before he started writing books and comics. “I became the public face of the magazine in many ways and I don’t really want to be that,” he told The Washington Post. “I want to be a writer. I’m not a symbol of what The Atlantic wants to do or whatever.”

* * *

Of course a public intellectual saw this coming. In a 1968 discussion between Norman Mailer and Marshall McLuhan on identity in the technology age (which explains the rise in STEM-based public intellectuals), the latter said, “When you give people too much information, they resort to pattern recognition.” The individuals who have since become symbols of thought — from the right (Christina Hoff Sommers) to the left (Roxane Gay) — are overrepresented in the media, contravening the original definition of their role as outsiders who spur public action against the insiders. In a capitalist system that promotes branded individualism at the expense of collective action, the public intellectual becomes a myth of impossible aspiration that not even it can live up to, which is the point — to keep selling a dream that is easier to buy than to engage in reality. But an increasingly intelligent public is gaining ground.

The “Public Intellectual” entry in Urban Dictionary defines it as “A professor who spends too much time on Twitter,” citing Peterson as an example. Ha? The entry is by OrinKerr, who may or may not be (I am leaning toward the former) a legal scholar who writes for the conservative Volokh Conspiracy blog. His bad joke is facetious, but not entirely inaccurate — there’s a shift afoot, from the traditional individual public intellectual toward a collective model. That includes online activists and writers like Mikki Kendall, who regularly leads discussions about feminism and race on Twitter; Bill McKibben, who cofounded 350.org, an online community of climate change activists; and YouTubers like Natalie Wynn, whose ContraPoints video essays respond to real questions from alt-right men. In both models, complex thought does not reside solely with the individual, but engages the community. This is a reversion to one of the early definitions of public intellectualism by philosopher Antonio Gramsci. “The traditional and vulgarized type of the intellectual is given by the man of letters, the philosopher, the artist,” he wrote in his Prison Notebooks — first published in 1971. “The mode of being of the new intellectual can no longer consist in eloquence, which is an exterior and momentary mover of feelings and passions, but in active participation in practical life, as constructor, organizer, ‘permanent persuader’ and not just a simple orator.” It doesn’t matter if you’re the smartest person in the room, as long as you can make it move.

* * *

Soraya Roberts is a culture columnist at Longreads.

House Un-American

Bettmann / Getty, House photo courtesy of Author, Collage by Homestead

Leslie Kendall Dye | Longreads | June 2019 | 24 minutes (6,524 words)

 

They say you can’t go home again, but I never stop trying. Sometimes I conjure the scent of jacaranda trees mixed with swimming pool chlorine, the sweet-then-sour first bite of kumquats, the faces of the little foxes in the bushes, the gleam of their eyes in the dark. The longer I live outside of Los Angeles, the more its mysteries call to me, as though the city itself were a piece of unfinished business. Maybe “unfinished business” is the very definition of home.


Total Depravity: The Origins of the Drug Epidemic in Appalachia Laid Bare

Getty / Black Inc. Books

Richard Cooke | Excerpt from Tired of Winning: A Chronicle of American Decline | Black Inc. Books | May 2019 | 21 minutes (5,527 words)

They shall take up serpents; and if they drink any deadly thing it shall not hurt them; they shall lay hands on the sick, and they shall recover.

Mark 16:18

One night John Stephen Toler dreamed that the Lord had placed him high on a cliff, overlooking a forest-filled valley. He had this vision while living in Man, West Virginia, where some of the townsfolk thought he was a hell-bound abomination; he countered that God works in different ways. The mountains were where he sought sanctuary, so he felt no fear; but as he watched, all the trees he could see were consumed by wildfire. It was incredible, he said, to see ‘how quick it was devoured’, and the meaning of the parable was clear. The forest was Man and the fire was drugs, and when the drugs came to Man, that was exactly how it happened – it was devoured ‘so fast, that you didn’t even see it coming’, he said. We were in Huntington, West Virginia, and by now John Stephen Toler was in recovery.


Becoming Family

Illustration by Tom Peake

Jennifer Berney | Longreads | May 2019 | 16 minutes (4,486 words)

 
“He’s really cute,” my partner Kellie whispered to me, moments after our first son arrived. He had a head of black hair and a pug nose. His eyes were alarmingly bright. Kellie rested one hand on the top of his head as he lay across my chest. “So cute,” she said.

Her declaration meant something to me. Because the baby wasn’t of her body, because he was of my egg and my womb and a donor’s sperm, I’d been haunted by the worry that she’d struggle to claim him as hers — that he’d seem to her like a foreign entity, like someone else’s newborn, red-faced and squirming.

Hours later, in the middle of the night, a nurse came into our room, tapped Kellie on the shoulder, and asked her to bring our newborn to the lab for a routine test. Kellie cradled the baby as the nurse poked his heel with a needle and squeezed drops of his blood onto a test card. Our baby, who was still nameless, wailed and shook. In that moment, she tells me, she was overwhelmed by biology, by the physical need to protect a tiny life.

* * *

In the fourth century BCE, Aristotle proposed a theory of reproduction that would persist for thousands of years. It’s a theory that, while scientifically inaccurate, still informs our cultural thinking about parenthood.

According to Aristotle, the man, via intercourse, planted his seed in the woman’s womb. The woman’s menstrual blood nourished that seed and allowed it to grow. She provided the habitat, he supplied the content. The resulting child was the product of the father, nourished by the mother.

When it came to parenthood, the woman’s essential role was to nurture what the man had planted within her. To father was simply to provide the material — a momentary job. Fathering was ejaculating. But mothering was nurturing. This job was ongoing, never-ending. Her care began at the moment of conception and continued into adulthood and beyond.

* * *

When Kellie and I came home from the hospital with our newborn, our house felt strangely quiet and bare. In the days preceding delivery, Kellie had cleaned and organized as a way of getting ready for the baby, and our house was now unusually tidy. We sat on the couch with our sleeping baby and admired him. We smoothed his hair so that it crested at the center of his forehead, Napoleon style, then we smoothed it to the side. We said his name — West — over and over, trying to teach ourselves the word for this new being. Every so often he twitched. I had the sense that our world was about to transform, that the quiet of the first newborn days was temporary.

In the days that followed, I roamed the house in mismatched pajamas and snacked on casseroles that friends had brought over. I nursed the baby and rocked the baby and watched the baby while he slept. Meanwhile, Kellie, wearing her daily uniform of work pants and a worn-out T-shirt, built walls around our back porch to create a mudroom for our house. In the months leading up to our baby’s birth, we’d agreed that our dogs would need such a room, a place set away from a baby who would one day be crawling and drooling and grabbing, and so we called Jesse — a carpenter acquaintance whom we had once, long ago, asked to be our donor, and who had considered it for two months before turning us down. He wasn’t game to donate sperm, but he was game to bang out some walls. All day, I heard Kellie and Jesse’s hammering and muffled conversation.

In this way we entered parenthood. I was the full-time nurser and the guardian of sleep; Kellie was the builder, the house-maintainer. At night, the baby slept between us.

* * *

The idea that paternity is primarily a genetic contribution, that a father’s role is simply to provide the seed, is a very stubborn one. An absent father is still considered a father. When we use father as a verb, we usually mean the physical act of conception, while to mother more often describes the act of tending to. When a father takes on some of the active parenting, when he drives the kids to school or makes them breakfast, we often refer to these acts as “helping,” as if he were doing tasks assigned to someone else. “He’s a good father,” I’ve heard people say, bemoaning a wife’s lack of gratitude. “He helps.”

“Who’s the dad?” is a question friends of friends ask at parties when they learn that my children have two mothers. It’s a question that distant relatives ask, eager for the inside scoop.

The idea that my son doesn’t have a dad, that it is indeed possible to not have a father, is a hard thing for people to wrap their minds around. They may understand the process of donor insemination, but still, they think, because conception requires sperm, every child must have a father. Even for me, it creates a kind of cognitive dissonance. When I say that my child has no father, I feel like I’m not telling the whole truth.

“Why doesn’t West have a father?” a wide-eyed boy asked me one day as he sat at a classroom table with West and three other first graders. I was helping them make illustrated pages, and somehow the topic of our family had come up. West looked at me anxiously.

“He has two moms,” I told the boy.

“But why?” he insisted.

Of the kids I knew in West’s class, one was being raised by grandparents and several more had stepparents or were being raised by a single mom. But I could see that our situation was the most confounding.

“That’s just the way our family works,” I said before rattling the crayon box and offering it around the table. The curious boy did not look satisfied, and West remained steady and silent.

* * *

* Some names have been changed to protect the privacy of individuals.

By the time our donor, Daniel*, met our baby, he and his wife Rebecca had a baby of their own and had resettled on the other side of the state. We met them at a pizza place on a weekday afternoon. It was spring in the Pacific Northwest and the sun glared on fresh puddles. They had come to town to visit family and meet with longtime friends who wanted to meet their new child. At the time, our relationships with one another were still undefined, and we counted more as friends than family.

I remember that meeting in fragments, like bits of color held up to the light: Trays of half-eaten pizza. Plastic cups filled with ice water. Rebecca holding her newborn, Wren, against her, a burp cloth draped over her shoulder. Wren’s bare baby feet and the creases in his chubby ankles. My own baby, old enough to crane his head, looking around with wide eyes and a two-tooth smile. All of us in constant motion — standing to rock the baby, sitting to feed the baby, slipping into the bathroom to change the baby’s wet diaper. We passed our babies from one parent to the other, then across the table. We lifted the babies, assessing their heft, then tried to meet their eyes so that we could bombard them with smiles.

I remember it this way: We were neither distant nor close, neither awkward nor easy. We’d all been remade by parenthood, and it was like we were meeting for the first time.

I had wondered before our meeting if West, at 6 months old, would connect to Daniel especially, if there really was some magic carried in their shared DNA, if our son would recognize him, cling to him, fall asleep against his chest. But he didn’t. West greeted Daniel with joyful curiosity, the same way he greeted any stranger, and then returned to my arms to nurse.

* * *

Several months later, Kellie and I drove six hours across the state — baby tucked in his infant car seat in the back — to meet Daniel and Rebecca again, in their new home.

The fog of new parenthood had lifted, and this time, the ease between us was instant. Rebecca and I each claimed a spot at her kitchen table, sat with coffee, and watched as our children chewed on toys and pulled themselves across the wood floors. Conversation between us was continuous. We found a rhythm of interrupting one thought with another, then picking up where we left off, all the while tending to our babies as needed — rising to lift and nurse them, to change a diaper on the floor, to pull a board book from a mouth. Time with Rebecca was a respite from the solitude and repetition of early motherhood, a dose of medicine I needed.


Kellie and Daniel found their places just as easily. They spent their time rewiring Daniel’s carpentry studio, or salvaging beams from a nearby teardown, or driving to the forest to cut up fallen trees for firewood. Each of them, I imagine, had experienced their own kind of solitude as they watched their partners devote themselves fully to another human, and they both, I imagine, felt relief in working side by side.

We became parallel, symbiotic. Two families on either side of the Cascade Mountains. Sometimes they traveled to us; other times we traveled to them. Our boys knew and remembered each other. They splashed each other in a steel trough in Daniel and Rebecca’s backyard, climbed trees that had grown sideways over the shore of Puget Sound, built forts together out of cardboard in our kitchen.

The beauty of our new extended family had little to do with anything we had asked for or planned. Two years earlier, a friend had suggested that Kellie and I ask Daniel to consider being our donor. We had met him only a handful of times, but we knew that we liked him. He was strong but soft-spoken, handsome but unassuming. We were nervous to ask him. We’d explored the prospect with several men already — with Jesse the carpenter, with a coworker, with other peripheral friends — but two ghosted, one said no, and another seemed to think that the resulting child would be his own. Daniel turned out to be different. When he and Rebecca showed up at our house to discuss the possibility, it seemed he was already clear. “What kind of involvement would you want?” he asked us. We had agreed only to stay in some kind of touch over the years, to not become strangers to one another.

And yet we wound up with something I’d never had and never would’ve thought to plan for. I grew up with cousins, but none my age. They were five years older, or 12 years older, or three years younger, or 20 years younger. They were also scattered far and wide across the country. My brother was seven years younger than me, and my half-siblings were so much older that they were almost like aunts and an uncle. So I found something deeply healing in having an extended family that was at once chosen, but also truly family, tied by blood.

Or was it even blood that tied us? In theory, we wanted to know Daniel forever because questions might arise about the DNA he’d shared with us. We might someday need to ask him about some rare disease or mental illness, to probe beyond the brief set of questions we’d asked over dinner that first night we talked. And then there was the way we’d been trained to see blood as a legitimizing factor, trained to understand that blood equals family. Like many queer families, Kellie and I, while challenging this notion, unconsciously embraced it. Daniel was blood-tied to our children and therefore he was kin.

But, even more than blood, it was fate that tied us. It was like that film cliché where one stranger saves another’s life and they are therefore bound to each other forever. Rebecca and Daniel had agreed to help us build a family, and their choice had a moral weight. Gratitude would forever bind me to them. The love that I felt for West contained a love for them. I couldn’t imagine it any other way.

So it made sense to me when, four years after we’d first shared a meal and talked about becoming family, three years after our sons were born, Rebecca called us to ask if we’d considered having another baby. We had.

“Do you guys want to get pregnant again?” Rebecca asked me that day on the phone. “Because, you know, we are.”


We went to visit them two weeks later and stayed in a motel two miles away. On our first morning, Kellie woke up before me and left in search of coffee. She came back with two paper cups filled with coffee, and also a small mason jar that held a quarter inch of semen. Later she showed me the text that Rebecca had sent: “Good morning! Donation is ready. Cum on over.”

Rebecca delivered a second son, Ryan, in November. I delivered a second son, Cedar, in January.

* * *

I am a gestational and biological parent. Kellie is an adoptive parent. We come to our roles differently.

That I gestated and breastfed my sons carries immediate, clear meaning for me. When they were babies, my smell, my voice, my touch meant sustenance. Kellie held them and bathed them and changed them, but she did not offer milk. In the middle of the night, it was my body they reached for. My role as gestational parent had immediate consequence: for the first three years, my children’s need for me was more urgent, more connected to their survival.

The other difference, the difference of biology, is far less clear. What does it mean to my family that Kellie shares no DNA with our children? Does it mean next to nothing? Or does it mean more than I want to admit?

In the 10 years that Kellie and I have raised children together, I’ve avoided asking her how she feels about being the adoptive parent. I’ve avoided it because I was afraid — afraid that she would confide that our children never fully felt like her own. I’ve been worried she might say that they felt more like small people she lived with and cared about, but that if our own relationship ended she wouldn’t know exactly where they fit.

In my own community of lesbians, there’s a legacy of loosely defined second parents. I know a number of women who conceived in the ’80s (back when artificial insemination was just beginning to be available to lesbians) and planned to be single parents. But then, during pregnancy or early in the child’s life, a partner entered the picture, stayed for a year or two, then left. The partner had no legal claim to the child, but in many cases continued to parent from a distance. I’ve spoken to some of their children — grown now — who have trouble defining their role with a single term. “She’s certainly my other parent,” one of them told me over beers, then went on to explain that the word Mom doesn’t feel right when her gestational mom “did every load of laundry, packed every lunch, and cooked every meal.”

“We had no blueprint,” she told me. “She was kind of like a weekend dad.”

Though Kellie is much more than a weekend dad, I’ve long worried about the ways in which her role as other-mother remains ambiguous and undefined.

“I feel like they’re mine” is the first thing Kellie told me when I finally summoned the nerve to ask her. But sometimes she worries that if I died, the world would not recognize her as a parent, and that our own kids might reject her. She feels secure in her own attachment, but the role the world assigns her is a tenuous one.


In her book Recreating Motherhood, Barbara Katz Rothman writes that the value our society places on genetic relationships is inherently patriarchal, tied to our initial false belief — based in Aristotle’s “flowerpot theory” — that men were the sole genetic contributors. Because the child was of the man, he belonged to the man. Once we recognized that mothers contribute half of the genetic material, we began to see mother and father as having equal claim to their child. Rothman asserts that this is still an inherently patriarchal position, one in which blood ties indicate a kind of ownership, and one in which the work of nurturance is not accounted for.

In our own contemporary culture, we may sometimes act as though we value nurture over nature. These days I see the truism “love is love” everywhere I turn — on signs, in social media, spoken aloud by celebrities and friends. The statement suggests that love alone is the element that legitimizes a couple or a family. Still, we track our ancestry and meet new genetic relatives — strangers whom we’ve been told are family — through services like 23andMe, and we marvel at the overlapping traits and mannerisms of close relatives raised apart from one another.

We’ve learned to be careful, when speaking of adoptees, to use terms like “birth mother” instead of “real mother,” acknowledging that genes and gestation are not the only thing that make a parent real. And yet, when someone does say “real mother,” we know exactly what they mean.

“Kellie’s not your real mom,” a neighborhood kid once told Cedar, who stood there agape because he had not yet thought to wonder too hard about his origins. At the time, he already understood that his family was different. When other people asked about his father, he had learned to explain, “I have two moms.” But as far as I could tell, this was the first moment someone had invited him to wonder about the actual legitimacy of his family — its realness.

* * *

Rebecca and I are tied by blood tangentially, but not directly. Our children are blood-related. She and I are not. Still, she feels more like family than many of my actual blood relations. Rebecca’s sister and nieces feel like family too, though they are not tied to my family by heredity. We live in the same community, so when Rebecca and Daniel come to town we have large family get-togethers: picnics at parks and birthday celebrations at restaurants. Sometimes Rebecca’s mom joins us too. When we meet she always hugs me and says my first name sweetly. She knows about what ties us, and so she feels tied to me too.

Meanwhile, Daniel’s family of origin is a mystery to me, for reasons of geographical distance and family culture. I see pictures of his relatives on Facebook and have to remind myself that his kin are also my children’s blood kin. My children’s faces may grow to bear resemblance to the faces I see in these photos: the long jawline, the aquiline nose. Or, pieces of these relatives’ histories may give clues to my own children’s futures — special talents and obsessions, illnesses and struggles. Even when I remind myself of this, it feels distant, hard to reach.


Kin: Your mother who birthed and nursed you, your father who bore witness to your childhood. Your grandmother who let you sleep beside her in the bed when you came to visit. Your aunt who drove you to her home for long weekends, where you lay alongside her golden retriever and looked at the forest through her windows.

Kin: The grandfather you never met who was a ne’er-do-well, whose legacy is a stack of letters and a rainbow painted on a barn. The uncle who joked around with you in childhood, but became distant as you got older. Your second cousin who discovered you online and now sends you a Christmas card every year.

Kin: Your brother who you speak to only a few times a year, but who you carry in your heart. Your aunt by marriage (then lost through divorce) who delighted you with her easy brand of sarcasm.

Kin: The cousins you’ve only met once or twice in a lifetime. When you see photos of them, some of them look like people you might easily know. Others look like strangers, like someone you might pass in a grocery store and immediately forget.

* * *

Kellie told me once that she hesitates when telling our kids about her family’s history. It’s not quite clear to her: Is her history their history, or is it something else? Long before she spoke this aloud to me the same question hung in my mind. Does her history matter to our kids because it’s their mother’s history, or because it is, somehow, their own?

When I look at my own ancestral family photos, I seek clues to who I am, traces of a self that predate me. Are these connections real, I wonder, or are they lore? Why does ancestral connection hold a sense of magic? Why do I look so hard to find my reflection in blood kin, as if seeing myself in my ancestors will somehow legitimize me?

And yet it turns out that some of my ancestors are not related to me genetically any more than Kellie is genetically related to our sons. Over the course of generations, our genetic ties to individual ancestors dissolve. Geneticist Graham Coop writes that if you trace your genetic heritage, after seven generations “many of your ancestors make no major genetic contribution to you.” In other words, your cells carry no trace of their DNA. They are no longer your genetic relatives, and yet they are still, of course, your ancestors. “Genetics is not genealogy,” he writes.

What if, more than heredity, families are really a collection of stories, some of them spoken, some of them withheld? Kellie’s ancestors were pioneers. My boys spent the first years of their lives in a house that her grandfather and great-grandfather built together. Kellie spends most of her free time splitting wood, building fences and sheds, capturing bee swarms. Cedar can now spot a swarm from a great distance. West is learning to measure wood and use a chop saw. They may one day raise their own families on the same land they grew up on. They may add new walls, new buildings, new fixtures. They do not require Kellie’s genes to carry on her legacy.

* * *

Four years after West was born, he asked me where he came from. It was a bright summer day and his brother — a baby then — was on a walk with Kellie, strapped against her chest. We were staying at a ranch in Colorado and the land was expansive: trails that went over bare hills and into forests, rocks and brush under wide blue sky. That afternoon West and I were inside our dark cabin, with light streaming through the windows and making patches on the floor.

I asked if he wanted to know who his donor was. “Do you want to guess?” I asked him. I was curious to see if he already had a sense.

“JoAnn?” he said, referring to a close family friend.

“The person who helped us is a man,” I said.

“Oh right,” he said. He thought and guessed some more, until I finally told him.

“It’s Daniel,” I said. “Wren’s dad.”

I watched him closely to see how he’d respond, but I detected neither joy nor surprise nor disappointment.

“Did Daniel help make Cedar too?”

“Yes,” I said.

He smiled. It didn’t surprise me that this was the thing that mattered to him — that he and his brother had the same origin story, that he wasn’t alone in the world.

* * *

We tend to understand our DNA as a simple blueprint for who we are and what we might become. We see experience as the tool that can push a person toward or away from their full potential, yet we see the potential itself as innate and fixed.

But in truth DNA and experience interact with each other. The field of epigenetics tells us that genes are turned on and off by experience, that the food we consume, the air we breathe, and how we are nurtured help determine which genes are expressed and which ones are repressed. Gene expression isn’t static. For instance, drinking green tea may help regulate the genes that suppress tumors. A sudden loss may trigger depression. And the amount of nurturing and physical contact a child receives in the early years may help determine whether or not he’ll suffer from anxiety as an adult. Currently researchers are investigating to what degree trauma in one person’s experience can cause a change in DNA that is transmitted from one generation to the next. Experience might become a legacy carried in blood.

Frances Champagne, a psychologist and genetic researcher, writes that “tactile interaction,” physical contact between parent and child, “is so important for the developing brain.” Her research shows that “the quality of the early-life environment can change the activity of genes.”

When Kellie held our newborn sons against her chest, when she bounced them and rocked them until they slept, she was not simply soothing them in the moment. She was helping program their DNA, contributing to their genetic legacy. Parents, through the way they nurture, contribute to the child’s nature. There is no clear line between the two.

* * *

In her memoir on adoption, Nicole Chung discusses the concept of family lineage and writes that she has been “grafted” onto her adoptive parents’ family tree. The graft strikes me as an apt metaphor. The scion is not of the receiving tree, and yet it is nourished and sustained by the tree. In the process of grafting, the tree is changed. The scion is changed. Through a process called vascular connection, they become one body.

The rootstock does not automatically reject the scion. The human body does not automatically reject an embryo conceived with a donor egg and sperm. A baby is comforted by warm skin, a smell, a heartbeat. A body loves a body. The baby may care that the source is familiar, but not that the DNA matches his own.

When Kellie’s mother visits with us, she often compares our boys to other members of her family. “It’s funny how Cedar’s blonde just like Noah, and wild like him too,” she’ll say, or, “West’s eyes are that same shade of hazel your grandpa’s were.”

I used to think she was forgetting that our children are donor conceived, or maybe just being silly. Now I realize it’s the opposite. Kellie’s mother doesn’t forget. She knows. She’s claiming them: tying her family’s present, past, and future, like stringing lights around the branches of her family tree, affirming that we belong to one another.
 

Jennifer Berney’s essays have appeared in the New York Times, The Offing, Tin House, and Brevity. She is currently working on a memoir that examines the patriarchal roots of the fertility industry, and the ways that queer families have both engaged with and avoided it.

* * *

Editor: Cheri Lucas Rowlands
Copy editor: Jacob Gross

Falling Stars: On Taking Down Our Celebrity Icons

Illustration by Homestead

Soraya Roberts | Longreads | May 2019 | 7 minutes (1,868 words)

The shorthand iconography of the star has been the iconography of excess — furs, gold, pearls, diamonds, stacks of cash, lots of lights, lots of people. It’s luxury personified, the human being at its apex, the kind of intermediary between gods and humans that the ancient Egyptians didn’t just dress with jewels, but buried with them, transcending mortality. And who doesn’t want to be immortal? Especially these days, when we are very much the opposite: when aspiration has been replaced with desperation and extinction is the inevitable end, or maybe hell, but definitely not heaven. The old accoutrements of success, the ones that defined celebrity — wealth, power, decadence — are going extinct too. And anyone who continues to buy into them is either performing satire (see Billy Porter in city-spanning golden wings) — or is, well, Drake.

The “God’s Plan” singer, who upon last estimation was worth around $90 million, unveiled his own private Boeing 767 cargo plane, Air Drake, in an Instagram video last week, a pair of praying hands on the tail fin speaking for us all. “No rental, no timeshare, no co-owners,” he said. No reality check either, apparently. While Drake framed it as his way of supporting a homegrown business (Ontario’s Cargojet), his very own “Heat of the Moment” lyrics — “All the niggas we don’t need anymore / And all the cops are still hangin’ out at the doughnut shops / Talkin ’bout how the weather’s changin’ / The ice is meltin’ as if the world is endin’” — caused a number of people to point out his hypocrisy. (He captioned the video, “Nothing was the same for real,” which I don’t believe is a reference to the planet’s demise, but maybe he was being meta.) It had been only seven months since Kanye and Kim Kardashian West were vilified for flying aboard a 660-seater Boeing. Basically alone. “No big deal,” Kardashian West said on Instagram. “Just like a chill room. This is, like, endless.” No, there’s an end. Their chill trip happened less than two months after the end days climate report came out.

At one point these stars were icons of the kind of success we aspired to. But having seen how the old capitalist system they symbolize has destroyed the world, the movement to destabilize it has also become a movement to destabilize them as its avatars. This includes idols of technology like Mark Zuckerberg, the once-envied wunderkind who is now someone who should be held “accountable”; business giants like Disney CEO Bob Iger, whose compensation is “insane” according to one member of the family dynasty; and political stars like Pete Buttigieg and Beto O’Rourke, both of whom were called out for their campaigns’ big donors. In our culture today, the guy who makes music out of his closet has the No. 1 song on the Billboard Hot 100 chart and the revolutionaries are schoolchildren. “The star is meant to epitomize the potential of everyone in American society,” writes P. David Marshall in Celebrity and Power: Fame in Contemporary Culture. “The dialectical reality is that the star is part of a system of false promise in the system of capital.”

* * *

The debate over whether success should be defined by wealth goes as far back as civilization itself. I asked my brother, a philosophy professor specializing in the ancients (I know), when it first turned up in the literature, and he told me it was “the base note” through most of Plato. Then there was Socrates, who thought knowledge, not wealth, should be the marker of success, versus Aristotle, who thought wealth was essential to the good life. Regardless of their differences, greed, my brother said, was almost always considered pathological. But then along came capitalism, which was popularized (peut-être) by French socialist Louis Blanc, who wrote Organisation du Travail, in which he defined it as “the appropriation of capital by some to the exclusion of others.” Within capitalism, greed became associated with productivity, which was correlated with a successful economy, and so greed was good (you try not to quote Gordon Gekko!). Along with it, those who were greedy were accepted, even admired, under certain conditions. A 2015 study had a bunch of U.K. teenagers excusing Bill Gates’s extreme wealth (more than $100 billion) as merit-based, the necessary evil of a capitalist system in which a hard-working individual can triumph the way they would like to one day.

The celebrity is the ultimate symbol of success, which, under capitalism, becomes the ultimate symbol of greed. “Celebrities reinforce the conception that there are no barriers in contemporary culture that the individual cannot overcome,” writes Marshall. And though Julius Caesar ended up on a coin, dating the monetization of fame back to ancient Rome, you can blame the French Revolution for a modern star like James Charles, who launched a YouTube channel of makeup tutorials at age 16 and within four years had more than 1.7 billion views. After the monarchy was overthrown, power and fame no longer required inheritance, which is why celebrity is sometimes (erroneously) associated with rebellion. But while the common man was ascending, so was individualism, along with mass media and the industrial revolution. The lord and serf were replaced by the businessman and employee and bourgeois culture expanded at the expense of its working-class analog. The icon of this new capitalist society, which had been weaned on the Romantic Era’s cult of personality, was the commodified individual who reinforced consumption: the celebrity. As Milly Williamson explains in Celebrity: Capitalism and the Making of Fame, “Celebrity offers images of inclusion and plenty in a society shaped by exclusion and structured in want.”

Is anyone playing the Kim Kardashian: Hollywood game anymore? The object was to use anything you had access to, whether material, money, or people, to advance. It was clearly a meta-tongue-in-cheek bit of cutesy puff, but it also wasn’t. Kim Kardashian West is you in the game and you in real life. Consumerism isn’t just consumption, it’s emulation. We consume to improve ourselves as individuals — to make ourselves more like Kardashian West, who is presented as the pinnacle of success — as though our self-actualization were directly associated with our purchasing power. And the same way we have commodity selves (I am Coke, not Pepsi; Dell, not Mac) we have celebrity selves. For instance, I’m a Winona Ryder person, not a Gwyneth Paltrow person (is anyone?). So my identity could very well be solidified based on whether I can find that Tom Waits shirt she always wears. And in these days of faces of brands, shaping yourself around Kim Kardashian West can actually mean shaping yourself around a $15,000 dress. “It is pointless to ask what Kim Kardashian does to earn her living: her role is to exist in our minds,” writes George Monbiot in The Guardian. “By playing our virtual neighbour, she induces a click of recognition on behalf of whatever grey monolith sits behind her this week.”

So who cares, right? So what if I want to be a $5,000 Louis Vuitton bag slung over Michelle Williams’s shoulder? It’s a little limiting, I guess, but fine (maybe?) — if we can trust the world to run fairly around us. According to a 2007 study in the International Journal of Cultural Studies, Brits who closely followed celebrity gossip over other types of news were half as likely to volunteer, less politically engaged, and the least likely to vote or protest. “It’s the capacity of these public figures to embody the collective in the individual,” writes Marshall, “which identifies their cultural signs as powerful.” It also identifies them as inert proxies for real community action. There is a veneer of democracy to consumerism, in that we are free to choose what we buy. But we are exercising our freedom only through buying (never mind that the options aren’t infinite); we are not defined as citizens, but as consumers. That the consumer has eclipsed the citizen explains in part why the appeals around climate change have been increasingly directed at the individual, pointing out how they will personally suffer if the world around them does — in a sea of individuals, the planet’s distress was not impetus enough. “The most important democratic achievements have been the result of working-class struggle and collective movements,” writes Williamson. “What is really extraordinary about working-class identity is not the potential celebrity in each of us, but precisely the solidarity and collectivity that is largely hidden from media representations of ordinary people.”

* * *

When Time released its list of the 100 most influential people in the world last month, I noticed that under the Icons category one of the images was a silhouette. Among all of those colourful portraits of famous faces, Mirian G. was an individual erased. I initially thought it was a power move, that this woman had chosen to trade in her identity for a larger cause. It turned out she was a Honduran asylum seeker, part of a class-action suit filed by the ACLU on behalf of families separated at the border, and that she had to be anonymous to protect herself. “In 2018, over 2,700 children were separated from their parents at the U.S.-Mexico border,” wrote Kumail Nanjiani. “Since that number is so unfathomably large, I think it is helpful to focus on one woman’s story.” In essence, the magazine found a way around the individual-as-icon, turning a spot for one into representation for many. It was a timely move.

It’s not that fame has become defunct — one study found that a number of millennials would literally trade their family for it — but celebrity isn’t the opiate it once was. Younger generations side-eye star endorsements, while online influencers, who affect the tone of friendly advice, have acquired monumental cachet. (Though James Charles recently lost millions of YouTube subscribers following a very public fallout with fellow beauty vlogger Tati Westbrook, he still has more than 13 million.) It comes with a catch, though: Millennials will actually pay more for brands that are socially responsible. This aligns with the growing number of young activists, not to mention the U.S.’s youth voter turnout in 2018, the highest in a midterm election since 1982. As Williamson concludes, “celebrity culture presents the human in commodity form, but it also consists of its opposite — the human can never be fully contained by the self-as-commodity, and the persistence of humanity is, in all circumstances, a cause for hope.”

While the citizen and consumer were once conflated, they now coexist, a separation that sometimes leads them to be at odds. The celebrity, the symbol of the latter, can in the same way clash with the former. In a context like this, Alyssa Milano’s ill-conceived sex strike, the latest case of a celebrity ham-fistedly endorsing feminist activism, is no longer simply swallowed in good faith. There is no good faith left, not even for our stars. They are symbols of an economy that consumes everything in its path, and struggling with them is part of a collective struggle with the inequitable, exploited world we live in, one in which each callout will hopefully add up to some semblance of change.

* * *

Soraya Roberts is a culture columnist at Longreads.

Technology Is as Biased as Its Makers

"Patty Ramge appears dejected as she looks at her Ford Pinto." Bettmann / Getty

Lizzie O’Shea | an excerpt adapted from Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Teach Us about Digital Technology | Verso | May 2019 | 30 minutes (8,211 words)

In the late spring of 1972, Lily Gray was driving her new Ford Pinto on a freeway in Los Angeles, and her thirteen-year-old neighbor, Richard Grimshaw, was in the passenger seat. The car stalled and was struck from behind at around 30 mph. The Pinto burst into flames, killing Gray and seriously injuring Grimshaw. He suffered permanent and disfiguring burns to his face and body, lost several fingers and required multiple surgeries.

Six years later, in Indiana, three teenaged girls died in a Ford Pinto that had been rammed from behind by a van. The body of the car reportedly collapsed “like an accordion,” trapping them inside. The fuel tank ruptured and ignited into a fireball.

Both incidents were the subject of legal proceedings, which now bookend the history of one of the greatest scandals in American consumer history. The claim, made in these cases and most famously in an exposé in Mother Jones by Mark Dowie in 1977, was that Ford had shown a callous recklessness for the lives of its customers. The weakness in the design of the Pinto — which made it susceptible to fuel leaks and hence fires — was known to the company. So too were the potential solutions to the problem. This included a number of possible design alterations, one of which was the insertion of a plastic buffer between the bumper and the fuel tank that would have cost around a dollar. For a variety of reasons, related to costs and the absence of rigorous safety regulations, Ford mass-produced the Pinto without the buffer.

Most galling, Dowie documented through internal memos how at one point the company prepared a cost-benefit analysis of the design process. Burn injuries and burn deaths were assigned a price ($67,000 and $200,000 respectively), and these prices were measured against the costs of implementing various options that could have improved the safety of the Pinto. It turned out to be a monumental miscalculation, but, that aside, the morality of this approach was what captured the public’s attention. “Ford knows the Pinto is a firetrap,” Dowie wrote, “yet it has paid out millions to settle damage suits out of court, and it is prepared to spend millions more lobbying against safety standards.”

Shelved: Tupac and MC Hammer’s Promising Collaboration

Illustration by Homestead

Tom Maxwell | Longreads | April 2019 | 14 minutes (2,898 words)

 

In 1990, rapper Stanley “MC Hammer” Burrell stood at the pinnacle of popular culture. His stage show featured 32 musicians and dancers, all of whom attended a rigorous boot camp. According to an Ebony magazine article from that year, the boot camp consisted of “four miles of jogging, weight training, and at least six hours of dancing daily.” “Hammer Time” cultural saturation included demonstrations of his athletic “Hammer Dance” on Oprah and appearances in commercials for British Knights athletic shoes and Pepsi. Hammer owned 2,000 pairs of baggy “Arabian pants,” which, along with gold lamé vests, made up his distinctive stage image.


High Expectations: LSD, T.C. Boyle’s Women, and Me

Illustration by Homestead

Christine Ro | Longreads | May 2019 | 16 minutes (4,208 words)

I’m sweaty, exhausted, and red-faced when I finally emerge from my final acid trip. My apartment is a mess of objects my friends and I have tried feeling, smelling, or otherwise experiencing: loose dry pasta, drinks of every kind, hairbrushes, blankets. My voice is hoarse from talking or shouting all night. I’ve had more emotional cycles in the past 12 hours than in the last several months combined.

What made me want to drop acid wasn’t a friend or a festival, but a book. Specifically, T.C. Boyle’s new novel Outside Looking In. The book has its problems, but one thing it gets right is the intensely social experience of LSD. Even taken alone, even as a tool for introspective reflection, it rejigs attitudes towards other people. This can be a gift, or it can be a weapon. And as a woman, I’m especially aware of the potential for the latter.

If You Should Find Yourself in the Dark

Illustration by Wenting Li

Debbie Weingarten | Longreads | May 2019 | 14 minutes (3,460 words)

If your son cries in the night, begin a slow insistent hush. With your lips, make the sound of a snake. Even before you are fully awake, place your bare feet on the floor. Say, Mama is coming, and then creep past the purple glow of the nightlight to where he is a ball in his bed.

Lay your hand on his back.

If the covers have gone astray, or if his brother’s pinwheel feet are in his face, or if he has rolled onto the plastic toy he took to bed — fix it all. Place the covers back beneath his chin. Readjust the brother, put the toy on the shelf, kiss the forehead. Feel your way back through the darkness, over the sleeping dog.

* * *

Long ago, my parents were spelunkers. They would disappear into a hole in the ground, unsure of where the cave would lead, and pick their way along in the dark, their carbide lights illuminating the stalactites and stalagmites. They insist they felt excitement and possibility.

Once they brought my brother and me to a cave they remembered from college. It was supposed to be a family adventure. Together we would explore, and my parents would remember the way out.

What I recall is the surprising totality of darkness. And the terror I felt when we squeezed through the smallest of passageways. And the solidness — the unmoveableness — of the rock. If I breathed out or turned my shoulders in a certain way, I imagined I could be stuck there forever. If anything were to give, it would not be the rock; it would be my girl-sized bones.

Decades later, I still cannot relax into the dark.