
‘I Don’t Think Those Feelings of Self-Doubt Ever Go Away.’

Heather Weston / Henry Holt

Amy Brady | Longreads | April 2019 | 10 minutes (2,627 words)

The truth has never been a universally agreed-upon concept. As most psychologists will tell you, a shift in perspective can alter how a situation feels as well as what it means. And most historians agree that the “truth” of any significant event changes depending on who’s telling the story.

In her astounding fifth novel, Trust Exercise, Susan Choi plays with both perspective and narrative structure to tell the truth, or “truth,” about a group of suburban performing arts high school students. The book begins with Sarah, a fifteen-year-old in deep lust with her peer, David. Their friends Karen and Joelle and the outcast Manuel round out the teenage cast. Martin is a theater teacher from England who spends a couple of weeks at the high school, and Mr. Kingsley is their beloved theater teacher who makes the students participate in trust exercises usually reserved for older, more experienced actors. His questionable teaching style and Martin’s over-familiarity with the students are clues that the adults view the teens as both children and grown-ups: as needing guidance to navigate the professional world of acting but also as already possessing the emotional development needed to withstand the cruelty it bestows upon them.

As the novel unfolds, Choi captures the rage and lust of teenage life with thrilling verisimilitude. Who hasn’t felt the devastation of unrequited love as a horny fifteen-year-old? Or felt mistreated in a friendship? Or held a secret from a parent? Choi’s descriptions of her characters’ psychological interiors are equally adept: The teens walk assuredly into a classroom one moment, only to feel crushed by self-doubt the next, their self-confidence ruled by roiling hormones.

The novel’s authenticity is what makes both of its structural shifts, when they arrive, so shocking; the lives of these teens feel too real to be anything but the truth. But after each shift, everything in the story that came before is changed — changed but not entirely undone. It’s as if we had been reading the novel through a telescope only to be handed a kaleidoscope to finish it; the story’s pieces are all still there, but now they are arranged in different and surprising ways.

The shifts bring revelations about what the students endured from their teachers and parents and each other. Some of the revelations are amusing in their familiarity. Others are heartbreaking for the same reason. Trust Exercise is a novel that resonates with the #MeToo movement, but it’s also a story as old as time — it’s about those in power taking advantage of those who are powerless to stop them.

Mystery Alaska

Getty / Photo illustration by Katie Kosma

Chris Outcalt | Longreads | March 2019 | 13 minutes (3,723 words)

The helicopter takes off from a narrow patch of grass off the side of Route 2, about 30 miles southeast of Fairbanks, Alaska. The two-lane highway runs like an artery through the heart of the Alaskan interior, connecting the state’s third-most populous city to the outer reaches of North America. I’m riding shotgun in the lightweight, four-passenger chopper; Colorado State University (CSU) archeologist Julie Esdale is seated behind me. Esdale, who earned her Ph.D. in anthropology at Brown University, has spent more than a decade in this part of the state, exploring millennia of soil with a community of other social scientists whose aim is to untangle the origins of humanity.

Fifty feet up, as the booming whop-whop of the rotor blades cuts through the air overhead, we crest a row of trees along the edge of the road, revealing a spectacular view: a massive, tree-lined valley framed to the west by the peaks of the Alaska Range, one of the highest stretches of mountains in the world. These jagged hills formed millions of years ago, when shifting tectonic plates collided along the Denali and Hines Creek Faults, pushing the earth 20,000 feet into the air. Our destination lies about 10 miles into this lowland known as the Tanana Flats. Esdale and her colleagues believe the spot, a vestige of a 14,000-year-old hunter-gatherer encampment hidden deep in the earth, could hold important clues to better understanding the behavior of North America’s earliest inhabitants.

Esdale helped discover and excavate this important ground, known as McDonald Creek, which turned out to be one of the oldest archeological sites in the country. Field crews found fragments of stone tools, charcoal dust left behind by ancient firepits, and remains of bison, mammoth, elk, and waterfowl. Admittedly, I hadn’t spent much time thinking about those who pioneered the landmass I’d lived on my entire life, let alone the particulars of their livelihood. But my interest was piqued at the thought of these scientists dedicating their professional lives to better understanding those who came before us, like a detective unit attempting to solve one of the first mysteries of mankind.


Esdale, who’s in her mid-40s and has straight, shoulder-length blond hair she often tucks under a ball cap out in the field, explained that Alaska is a hot spot for this research — a matter of both history and geography. The last ice age took hold about 2.6 million years ago, and the low sea levels that came with it exposed a well-documented land bridge between what is now Russia and Alaska. When the ice began to melt around 12,000 years ago, water levels in the Bering Strait rose, submerging the area known as Beringia. But before the glaciers thawed, early humans wandered east across this land bridge. They were the first people to set foot in the New World, and they walked straight into what is today central Alaska.

…my interest was piqued at the thought of these scientists dedicating their professional lives to better understanding those who came before us, like a detective unit attempting to solve one of the first mysteries of mankind.

“Early sites are hit and miss in the lower forty-eight,” Esdale told me. “But in the interior, we’ve got lots and lots of them.” Still, she said, Alaskan archeology, perhaps too far-flung to register in the mainstream, was often overlooked in favor of research in the continental United States. Esdale’s husband, Jeff Rasic, also an Alaskan archeologist, told me he’d attended numerous national meetings of top researchers in the field and had often been struck by how little they tracked new findings in Alaska. “These are full-time academic archeologists,” Rasic said, “and they’re behind.” When I first contacted Esdale by phone last year, she said she’d be happy to show me around if I ever wanted to have a look up close.

By chance, I flew into Fairbanks two days ahead of the summer solstice, which brings nearly 24 hours of daylight to the region. When I landed close to midnight, the sky was bright enough that it could’ve easily been noon. (Later, I heard that a popular American Legion baseball game was scheduled for the following night. First pitch: 12:01 a.m.) I met Esdale early the next morning. We stopped at the local Safeway for a coffee and to pack a lunch, then headed to the helicopter launch site. After about 15 minutes in the air, Esdale pointed to our landing spot, a prominent mound that jutted above the flat, wooded landscape.

As we approached, she explained the scenery would’ve looked a lot different 14,000 years ago; the ground was still recovering from the ice age’s deep freeze and the trees hadn’t grown in yet. Nevertheless, I could see what the people who camped here back then were thinking. The high point of an otherwise flat area would’ve been a good place to look out for predators, scout prey for their next meal, or simply rest their legs and enjoy the view after a long walk. At least that last part, I thought, we had in common.

***

In Alaska, a state known for its expansive territory, the federal government is the largest landowner, controlling about 61 percent of the terrain. Most of that is allocated for public use and managed by the National Park Service and the Fish and Wildlife Service. There are other operators, however; notably, the United States Army oversees the use of about 1.5 million acres in the central part of the state.

Drawn to the open, undeveloped land and distinct climate, the military has maintained a presence in interior Alaska since the 1930s. Today, the local base is known as Fort Wainwright, “home of the Arctic Warriors.” During the frigid Alaskan winters, soldiers test gear, vehicles, and the limits of their own bodies in extreme cold. What’s more, with ample space, units can spread out to simulate wartime drills and construct practice bombing ranges. But although there are few neighbors to disturb, federal law — the National Historic Preservation Act and the Archeological Resources Protection Act — requires that the military pay close attention to what might lie beneath the surface. In fact, given that the area is archeologically rich, the Army funds a team of about half a dozen people who make sure it doesn’t trample any sensitive material — anything from stone tools or rock carvings to portions of structures or grave sites at least a century old. For the past eight years, Esdale has run the team.

Esdale first moved to Alaska in 2002 as a student, several years before getting the gig with the Army. She’d been conducting research for her Ph.D. in the far reaches of northwest Alaska when she met her husband out in the field. Not long after, Rasic got a job with the National Park Service based in Fairbanks; they made the move north together, two scientists in love headed for the Last Frontier. That first year they got a dog, a big, goofy lab who demanded a lot of time outside — even when it was 50 below and felt like your eyelids would freeze shut after a few minutes. Eventually, Esdale and Rasic had two boys and she got the contract with the Army. By then Fairbanks felt like home.

Although sharpshooting members of the armed forces and a crew of erudite scientists studying human history might seem like strange bedfellows, the partnership has identified hundreds of significant sites hidden in the Alaskan tundra. Take McDonald Creek, for example. Several years ago, the brass at Fort Wainwright proposed building a road through the Tanana Flats. A team headed by Colorado State’s Ned Gaines, which included Esdale, dug a few test pits while surveying in advance of the development. “Everywhere we put a shovel, we found artifacts,” Esdale said. The Army rerouted the planned road, and excavation of the site was turned over to Texas A&M researcher Kelly Graf.

Although sharpshooting members of the armed forces and a crew of erudite scientists studying human history might seem like strange bedfellows, the partnership has identified hundreds of significant sites hidden in the Alaskan tundra.

I met Graf and her team of mostly graduate students last summer. From the clearing where our helicopter landed, Esdale and I walked a well-worn path to a sort of base camp — an area among the trees about 80 feet in diameter. The camp was surrounded by a small, pop-up electric fence designed to keep animals away, and there were dozens of water jugs and large plastic bear-proof storage containers that resembled beer kegs. About 10 people sat around in fold-out camping chairs and on tree stumps finishing their lunch. This was Graf’s fourth year digging at the remote location. One highlight, she said, was that they’d recently found what appeared to be a bone from a dog. Graf said the discovery could amount to evidence of the earliest known domesticated canine in North America. While we were talking, she wondered aloud whether these early people would have traversed Beringia via some sort of dogsled or used the animals to help shoulder the weight of their belongings.

After lunch, the group migrated to the nearby dig location, a large pit that looked as if someone had pressed a massive rectangular cookie cutter into the ground and discarded all the dirt in the middle. Excavating an archeological site is tedious work, a far cry from the escapades of the world’s most famous member of the trade, the fictional character Indiana Jones. Rather, it consists mainly of carefully scraping away layers of dirt with a trowel and cataloging any items for further examination and analysis. “Our goal as anthropologists — it’s not just about treasures, not just about finding stuff,” Esdale told me. “It’s to understand people.”

Scientists have learned a lot about the founding populations of Indigenous peoples who lived in this area, particularly about how they subsisted. These people were mobile, resourceful, and skilled — unquestionably successful big-game hunters who preyed on bison, elk, and maybe even mammoth. They used spears and a throwing device called an atlatl, a curved tool made from wood, bone, or ivory not unlike the plastic tennis ball throwers popular at dog parks today. Hunters used it to launch darts fashioned with a pointed stone tip. (The bow and arrow didn’t show up for another 12,000 years.) Flakes discarded during the sharpening of these points are often found in the soil at sites like McDonald Creek.

‘Our goal as anthropologists — it’s not just about treasures, not just about finding stuff,’ Esdale told me. ‘It’s to understand people.’

For her part, though, Graf hoped to find more than flakes. Carbon dating of charcoal left behind by campfires and preserved 10 feet underground suggested that people occupied this location three different times throughout history — 7,000, 13,000, and nearly 14,000 years ago — making it one of the oldest sites in Alaska. “It’s an interesting place,” Graf told me. “We’ve always been looking for the base camp of these people. There are a lot of hunting camps around, shorter-term sites, but somewhere they had to be hunkering down, where grandma and grandpa and the kids and the mom, where everyone was hanging out. That’s kind of what we’re wondering, because this is a nice, fixed spot.”

“So, this could be that type of place?” I asked.

“Could be,” she said. “Could be.”

***

On my second day in Fairbanks, Esdale introduced me to an archeologist in his mid-70s named Chuck Holmes. He had a full head of neatly parted gray hair and a trimmed white beard. Before we met, Esdale outlined Holmes’s long resume. He’d taught at multiple universities, enlightening undergrads and guiding Ph.D. candidates, and had held senior-level science jobs with both the state and federal governments. It all amounted to decades of research and discoveries in the region. Hearing Esdale, I got the impression she was describing a sort of grandfather of Alaskan archeology.

Holmes first came to Alaska from Florida, about as far away as you can get in the United States — a fact his mother made sure to note when Holmes told her he’d decided to enroll at the University of Alaska Fairbanks in 1970. Holmes had fallen for the state’s wide-open territory the year before. Thanks to a friend’s father who worked for one of the railroad companies, Holmes and his hometown pal landed summer jobs laying train track across the tundra. “My friend was a little less interested in doing that kind of work; I just saw it as an adventure,” Holmes said. “I got in good shape and got to see quite a bit of the state.” From then on, aside from brief stopovers in Calgary, Canada, and Washington state, Holmes has spent his life in Alaska.

Holmes told me that as a kid he’d always had a penchant for finding things, so it was perhaps no surprise that during his undergrad years in Fairbanks he found archeology. “I was hooked on Alaska at that point,” Holmes said. But it was something he discovered two decades later that Esdale wanted me to learn more about: another archeological site not too far from McDonald Creek. The spot was known as Swan Point, and it happened to be the oldest site with evidence of human activity not just in Alaska but anywhere in the United States.

Back then, in the early 1990s, Holmes worked for the Office of History and Archeology in Alaska’s Department of Natural Resources. One summer, he led a group of students digging at an already well-established site in the Tanana Valley. A couple of the kids involved in the excavation wanted to venture out to look for something new, so Holmes pulled out a couple of maps and a compass, essential tools for an archeologist in the days before Google Earth. He identified what looked like a promising topographic feature: a hill off in the woods that appeared high enough to function as a lookout point, but not so high that it would’ve deterred a group of hunter-gatherers from climbing to the top. Holmes told the students to check it out, dig a few holes, and see what they found.

On their first attempt, the kids had trouble pinpointing the right location. Holmes sent them back the next day with additional instructions, and this time they returned with wide grins. First, they handed Holmes a couple of small plastic bags containing flakes likely cleaved from a stone tool. Not bad, Holmes thought. That was enough to suggest the site was worthy of further exploration. The students, however, had one more bag to show off. This one contained a scrap of ivory. The hard, white material, typically part of a tooth or tusk, is much more difficult to find in the wild, particularly in a shallow test pit dug at a somewhat hastily selected point on a map. It was like plucking a needle you didn’t know existed from a haystack the size of Delaware.

Holmes and other researchers excavated Swan Point on and off for the next two decades. Carbon dating placed it at about 14,200 years old. Scientists uncovered all kinds of gems, including stone tools, bones from a baby mammoth, food-storage pits, and hearths where campfires once burned. The findings from Swan Point have been documented and published in numerous scientific papers, and in 2008 the government listed the site on the National Register of Historic Places. As it turned out, Holmes explained, much of the Swan Point technology was similar to what had been commonly found by scientists on the other side of the land bridge in Siberia, suggesting these people were related in some way. “These guys, we’re not really sure who the heck they are,” Holmes said, referring to whoever camped at Swan Point so long ago.

“They’re basically Asian; they are ancient folk,” he said. “But their genes carried into the New World.”

***

Later that day, after meeting Holmes, Esdale and I bumped along an overgrown, two-lane Jeep road that ran deep into the woods. We were headed toward another archeological site on Army lands, this one dating back about 13,000 years. The road dead-ended at a clearing atop a ridge with a view of a river and an open forest below. Esdale explained this location, aptly named Delta River Overlook, marked the first time that archeologists had found a Beringian site that humans appeared to have occupied in the winter. They could tell, she said, based on the existence of a specific tooth that had belonged to a baby bison — a molar that only erupts in the cold season.

Winters were lean times for humans 13,000 years ago. In addition to tracking larger animals and storing the frozen meat under rocks, hunters in these tribes also set snares to trap small game for times when the weather made it challenging to venture too far from camp; at Delta River Overlook, for example, there’s evidence of grouse and ground squirrel. Staying warm was another challenge. Furs from big-game animals helped, but scientists are still piecing together the picture of what their shelters might’ve looked like that long ago. Best guess from ethnographic evidence, Esdale told me, is that families constructed dwellings by draping animal skins over a dome of flexible branches and packing the outside with snow for additional insulation.

The excavation of the Delta River site was led by a professor of archeology at the University of Alaska Fairbanks named Ben Potter. Potter was in China on a research trip when I visited Alaska, but I spoke with him on the phone later. Like Holmes, he’s made a number of important contributions to the Alaskan archeological canon. Potter’s body of work, however, contains one unique entry: He uncovered the oldest human remains found to date at an archeological site in Alaska. The first finding occurred in 2010, after years of work at an 11,500-year-old site known as Upward Sun River.

Potter and his team were contracted in 2005 to conduct a survey ahead of a proposed railway expansion through Army lands 40 miles from Fairbanks. His crew dug a few test pits and found evidence of human activity. The rail project was eventually rerouted, and in 2009 Potter received a grant from the National Science Foundation to continue excavating and investigating the site. He made the startling discovery the following year. About a meter down, Potter’s crew found parts of a human skull; later analysis determined the bones had come from a cremated 3-year-old child. In 2013, the team dug deeper into the site and found the remains of two infants. Extracting human remains from the ground in Alaska necessitates consulting with local Indigenous tribes, which maintain a notable presence in their ancestral lands in the state — about 100,000 people spread across at least four groups. With the support and cooperation of local tribal leaders, Potter’s team removed the bones and sent out a sample for genetic analysis. They published the results last year.

The goal is just knowing more — to keep understanding.

The DNA makeup revealed an entirely new population of Native peoples, a group Potter labeled “Ancient Beringians.” There were other important findings at Upward Sun River. For example, the team discovered fish bones buried in a hearth, where hunters would’ve cooked their meat, which helped Potter establish the earliest known human consumption of salmon in the Americas. Previously, scientists had thought the first salmon consumption occurred near the ocean. “It wasn’t on the coast, it was in the deep interior rivers,” Potter said. “That’s pretty exciting.” But the conclusions drawn from the DNA analysis were by far the most significant: a previously unknown branch of ancient humans.

It was a substantial addition to the archeology of the time. Although the general narrative about the early migration of people from Siberia to the Americas is mostly agreed upon, the specifics are subject to ongoing debate among social scientists. When exactly did these ancient people first arrive in Alaska? Did they settle down? If so, for how long? When did they colonize the rest of the Americas? Did they travel inland or along the coast? What the DNA from Potter’s discovery and other analysis showed was that for a period of several thousand years the genetic code of early Indigenous people evolved in isolation, no longer mixing with the DNA of those who lived in eastern Asia. It also appeared that these Ancient Beringians were eventually separated from those who went on to colonize the rest of the Americas.

Two other groups of scientists have since discovered new genetic evidence that Potter felt buttressed his work. The findings included, in part, a 12,600-year-old human DNA sample from a cave site in Montana and a single tooth preserved from a 1949 dig at a 10,000-year-old site in western Alaska, hundreds of miles from Fairbanks. The tooth had long been forgotten, stashed away on a dusty shelf at a museum in Copenhagen, Denmark. It was found by, of all people, Esdale’s husband, Rasic. As it turned out, the genetic makeup of the tooth matched that of the children from Upward Sun River.

“This actually clarifies quite a bit,” Potter told me when I followed up with him after the new papers were released. He walked me through the scenario he saw taking shape: People were likely living in Asia around 16,000 years ago. The glaciers began to melt, and tribes migrated from western Beringia to Alaska around, say, 15,000 years ago. Then you have a split: Ancient Beringians sticking around Alaska and another group traveling south, either inland, along the coast, or both, entering the rest of the Americas. That second group, he said, looked to be a single population that spread quickly and later split into many lineages.

Talking with Potter about the DNA results and migration theories reminded me of a conversation Esdale and I had on our drive out to Delta River Overlook, the day before I left Alaska and flew back to the rest of the United States. We’d been talking about how, given the ancient and fragmentary nature of their evidence, archeologists are necessarily adept at spinning complex abstractions from limited clues, whether it’s the shape of a microblade point or a scrap of an animal bone. It seemed to me, however, that this meant there was no endgame to this work — that it could go on forever, like trying to solve a massive jigsaw puzzle in which an untold number of pieces were destroyed eons ago. When I floated this thought to Esdale, she laughed. “Yeah, no, there’s never an endgame. The goal is just knowing more — to keep understanding.”

We continued along the Jeep road into the forest.

“I never really thought about it like that,” she said.

***

Chris Outcalt is a writer and editor based in Colorado.

Editor: Krista Stevens
Fact-checker: Samantha Schuyler
Copy editor: Jacob Gross

The Curious Tale of the Salish Sea Feet

Getty / Unsplash / Photo illustration by Katie Kosma

Kea Krause | Longreads | April 2019 | 16 minutes (3,905 words)

They come by way of similar discovery: A beachcomber, perhaps gathering shells or out for some exercise, spots a flashy, nonpelagic lump that, upon closer inspection, turns out to be a human foot still nestled in its shoe. The feet, both lefts and rights, come in all sizes — sometimes wearing New Balance or Nike, occasionally a hiking boot, and sometimes still attached to leg bones, a tibia sticking out like a stake in the ground.

To the intrigue and often horror of Pacific Northwesterners, in 2007 feet began washing up along the shores of the Salish Sea, an inland ocean spanning nearly 500 miles from Olympia, Washington, the state’s capital, to Desolation Sound, in British Columbia, Canada. Today the tally is 21 feet and counting (15 in BC, six in Washington). So prevalent are the gruesome discoveries that the BC coroner’s office has a map marked up with each new find: Foot #1 — a right, found in August 2007 — floated up to Jedediah Island in a generic white sneaker with navy blue accents; Foot #5 turned up in a muddy Nike; Foot #13 wore black with Velcro. New Year’s Day 2019 delivered the most recent foot, number 21, to a beach in Everett. It tumbled ashore in an aging boot, its condition indicating it had been out to sea for “some time,” according to local police.

A pattern of body parts washing ashore has all the trappings of a serial killer scenario or a horror movie or, at the very least, of an otherworldly phenomenon. Earned or not, the Pacific Northwest has a haunting prestige — the home of Gary Ridgway, the Green River Killer, and Ted Bundy, and now also the land of Twilight’s Hollywood vampires in Forks, out on the peninsula. Some morbid element of the region has arrested our imaginations. It could be the skies: so gray and responsible for all the rain that keeps everything perennially damp. Or perhaps it’s the abundance of old-growth timber — plenty of dense and protected woods for stashing bodies. Rivers branching across the state are another nature-made means of evidence disposal. It is rumored that Ridgway discarded the bodies of as many as 70 women around the Green River, which descends 65 miles from the Cascades and enters Puget Sound just west of Seattle. In Washington State, geography and meteorology conspire to creep us out. But perhaps most lurid is the ocean itself, not just because it continues to spew body parts to its surface but also because of its infinite and perplexing nature. Its unknowability, though alluring to those in the script-writing business, has puzzled scientists and casual observers of the Sound for generations.

The southern portion of the Salish Sea is more familiarly known as Puget Sound, a body of water serving the Seattle metropolitan area, home to about 3.8 million residents and plenty of industry — Amazon, Boeing, Microsoft, among others — all luxuriously settled in one of America’s most beautiful and diverse oceanic ecosystems. Seattle is rainy and weird, a place for artists and musicians to brood beneath weather-pregnant clouds, an offbeat city for both the creative and outdoorsy, resting in a hammock between two mountain ranges. But recently the area has seen changes out of its control: The tech industry is expected to expand the population of the Salish Sea region to 9 million people in the coming decades, and it has already wiped away many of the city’s distinctive traits. The former home of Kurt Cobain and birthplace of grunge now has a median home value of more than $700,000 and mostly functions to accommodate well-compensated tech workers. It’s still weird though — after all, feet keep floating ashore.

A pattern of body parts washing ashore has all the trappings of a serial killer scenario or a horror movie or, at the very least, of an otherworldly phenomenon.

Last fall, I went looking for a foot. More specifically, I went to Crane’s Landing on Whidbey Island — a refuge in Puget Sound just north of Seattle — where a foot had been found, to see if the beach would tell me anything about why the sea had dropped the foot there. Off the ferry, I drove a narrow roadway so starved of sunshine that moss grew along its centerline. It wound through a collection of homes that petered out down by the water in a dead end. The pebble beach, composed mostly of smooth skipping stones, was lined with a row of ragged pilings, head-high with rotted bases, the remnants of the landing that had given the beach its name.

When you’re from Seattle, it’s almost routine to be dazzled by the macabre sagas of the sea. As a child, my favorite story was one my uncle told about a body floating up behind his live-aboard sailboat on Lake Union. The idea of that bloated body floated into my imagination, and from then on, when visiting my family on their sailboat, I would keep my eyes glued to the water in the event another poor soul should bob up to the surface for my discovery.

The American Worth Ethic

Getty / Photo Illustration by Longreads

Bryce Covert | Longreads | April 2019 | 13 minutes (3,374 words)

“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.

The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes and overall wealth and in greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.

Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.

* * *

Trump has a long history of antipathy to the poor, a word he uses as a synonym for “welfare,” which he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”

Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”

It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it further inflated by reducing how much went to the government. Donald and his siblings helped their parents dodge taxes with sham corporations, improper deductions, and undervalued assets, evading levies on gifts and inheritances.

If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.

Even the money that was made squarely owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.

Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.

That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In its lengthy tome, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which up until the Trump administration only existed for direct cash assistance and food stamps, compel a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork to prove those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are only deemed important for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country only expects the poorest to work in order to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.

* * *

The report doesn’t just represent an ethos in the administration; it was also a justification for concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order directing federal agencies to review public assistance programs to see if they could impose work requirements unilaterally, to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”

The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.

These aren’t fiscal necessities — they’re crackdowns on the poor, justified by the idea that people should prove themselves worthy of the benefits that help them survive, and they are not just cruel but out of step with real life. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.

Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two-thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.

Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.

If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries by percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like a lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.

We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?

We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.

Plenty of other countries don’t balk. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; instead, it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that they also have a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.

Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.

* * *

The CEA’s report on work requirements claimed that being an adult who doesn’t work is particularly prevalent among “those living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country has, on average, just $826 each in income from capital investments; the average for those in the top 1 percent is more than $16 million.

The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.


In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.

While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings could qualify for the new discount. If someone is passively involved, however — a shareholder who doesn’t do much of the day-to-day work of the company — then he gets 100 percent of the new benefit.

Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay it will be paying less when tax time comes due — $4.4 million less, to be exact.

Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.

* * *

Trump’s CEA report is accurate about at least one thing: Our country has a history of offering public benefits only to those poor people deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.

Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.

We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during the same era that we also created significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.

Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.

The country is due for a reckoning with our obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we only concerned with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? It doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.

* * *

Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.

Editor: Michelle Weber
Fact checker: Ethan Chiel
Copy editor: Jacob Z. Gross   

When Zora and Langston Took a Road Trip

Library of Congress / Corbis Historical / Getty, Michael Ochs Archives / Getty

Yuval Taylor | An excerpt from Zora and Langston: A Story of Friendship and Betrayal | W. W. Norton & Company | March 2019 | 30 minutes (8,692 words)

 

Ornate and imposing, the century-old Gulf, Mobile and Ohio Passenger Terminal in downtown Mobile, Alabama, resembles a cross between a Venetian palace and a Spanish mission. Here, on St. Joseph Street, on July 23, 1927, one of the more fortuitous meetings in American literary history occurred, a chance incident that would seal the friendship of two of its most influential writers. “No sooner had I got off the train” from New Orleans, Langston wrote in The Big Sea, “than I ran into Zora Neale Hurston, walking intently down the main street. I didn’t know she was in the South [actually, he did, having received a letter from her in March, but he had no idea she was in Alabama], and she didn’t know I was either, so we were very glad to see each other.”

Zora was in town to interview Cudjo Lewis, purportedly the only person still living who had been born in Africa and enslaved in the United States. She then planned to drive back to New York, doing folklore research along the way. In late 1926, Franz Boas had recommended her to Carter Woodson, whose Association for the Study of Negro Life and History, together with Elsie Clews Parsons of the American Folklore Society, had decided to bankroll her to the tune of $1,400. With these funds, Zora had been gathering folklore in Florida all spring and summer. She was the first Southern black to do this, and her project was, even at this early stage, clearly of immense importance. It had, however, been frustrating. “I knew where the material was, all right,” she would later write. “But I went about asking, in carefully accented Barnardese, ‘Pardon me, but do you know any folk-tales or folk-songs?’ The men and women who had whole treasuries of material just seeping through their pores, looked at me and shook their heads. No, they had never heard of anything like that around there. Maybe it was over in the next county. Why didn’t I try over there?”

Langston, meanwhile, had been touring the South for months, penniless as usual, making some public appearances and doing his own research. He read his poems at commencement for Nashville’s Fisk University in June; he visited refugees from the Mississippi flood in Baton Rouge; he strolled the streets alone in New Orleans, ducking into voodoo shops; he took a United Fruit boat to Havana and back; and his next stop was to be the Tuskegee Institute in Alabama. It was his very first visit to the South.

When Zora invited him to join her expedition in her little old Nash coupe, nicknamed “Sassy Susie,” Langston happily accepted. (The car looked a lot like a Model T Ford, and could only seat two.) Langston adored the company of entertainers, and Zora was as entertaining as they came. Langston did not know how to drive, but Zora loved driving and didn’t mind a whit. They decided to make a real trip of it, “stopping on the way to pick up folk-songs, conjur [sic], and big old lies,” as Langston wrote. “Blind guitar players, conjur men, and former slaves were her quarry, small town jooks and plantation churches, her haunts. I knew it would be fun traveling with her. It was.”

Wonder Woman

Getty / Simon & Schuster

Mary Laura Philpott | I Miss You When I Blink | Atria Books | April 2019 | 10 minutes (2,808 words)

 
People blame their parents for their flaws and eccentricities all the time. In interviews, in therapy, in memoirs, they enumerate the many ways their mothers fucked them up. It seems we can’t discuss the way we are without assigning some responsibility to the generation before. Anyone can do it.
 
 
Chapel Hill, North Carolina. I was in first grade. My mother picked me up from school in our family Buick, as always. My dad, still in the early years of his medical career, was off working at the hospital most of the time, so the role of daily caretaker fell to her, as it did with most mothers then. She had been a schoolteacher before we were born—me, then my brother—and once she had us, she stayed home and we became her tiny class of two. When we were little, she was the one human being we saw most. She was our guide to how the world worked, not to mention our food source, our referee, our correctional officer, our chief entertainer—the de facto center of our universe.

That afternoon, I unloaded my Wonder Woman book bag onto the vinyl bench seat of our car and showed my mother the stack of papers we’d all been sent home with, a list of words printed on each page. Easy ones like love, candy, bike, and harder ones like breath, power, and understand. That week there was to be a spelling contest, winnowing the class down to the best spellers, ultimately crowning a champion.

Later that evening and every night that week, after my brother had been put to bed, she sat at one end of our green chenille sofa and I sat at the other as she called out two pages’ worth of words for me to spell aloud. I flailed around on the cushions, impatient, wanting to get down and read a book. “Why two?” I whined. “The teacher said one page a day.” My mom—in the same matter-of-fact tone she used for important edicts such as Stay out of the street; Eat your fruit; Go back and brush those teeth again, they’re still yellow—said, “Always do more than expected. That’s how you win.”

That’s how you win.

By the time the spelling bee started on Monday, I was ready. I moved on to the next round and did it all again on Tuesday, then Wednesday, then Thursday. When Friday came, sure enough, I clinched that spelling bee. I don’t remember if I got a medal or whether the other kids high-fived me, but I can vividly remember—as if she were standing in front of me right now—my mother’s beaming face. She raised her eyebrows and nodded as she broke into a smile. She was proud of me, and I was the Wonder Woman of spelling.

Had the term existed back then, my mom probably would have been deemed a tiger mother. She taught my brother and me to read when each of us was three, starting us out with Hop on Pop and Go, Dog, Go! In second grade, she offered me a Rubik’s Cube if I could ace my multiplication tables before the class deadline. In middle school, she woke us up every weekday at 5:45 a.m. to practice our piano. She never used cruelty—we weren’t chained in a cellar practicing fractions, although our protests may have sounded like we were. But through repeated practice, she made it clear that we were not fully prepared until we were overprepared, and that the desired goal, the only goal, was an A. Nobody makes a B in this house.

It was a simple rule—“work first, play later”—and it taught me that the natural order of things was to study hard, achieve your goal and receive the approval of your loved ones, and then (but not a minute before) relax.

We weren’t a family who held hands during the blessing or told each other we loved each other out loud, but the look on my mother’s face when I showed her an A+ said, “I love you.”

Good grades gave me evidence that, at least until the next test, I was secure in my place as a preferred person in my house and in my school and—probably, why not?—in the world. Naturally it stood to reason that the opposite was true as well. I remember the times I didn’t make good grades. There was a decimals test in fourth grade. After we got it back, everyone had to get it signed. I held it out to my mom, searching her face for a reaction as she put her signature on the page right next to the dreaded 80, feeling in my gut the absence of her smile. It was the absence of the ground beneath my feet. I may not have grasped decimals perfectly, but I could do this reverse calculation: If an A means You are loved and you belong here, then anything less than an A must mean You are not and you don’t.

When you internalize what you believe to be someone else’s opinion of you, it becomes your opinion of you.

I came to rely on grades for my regular jolt of self-esteem. It’s a miracle I didn’t end up with a back injury from bringing all my books home every night in case I realized I needed to complete an extra assignment in something. It became my routine, one that lasted well past middle school into high school and even college, long after the days of bringing grades home for a signature: Study my ass off, panic that my run of luck was over and I’d fail, then get my grades back. The validation would rush to my head, a perfect high. Each hit set chaos into order. Every check mark, every gold star, confirmed it: I succeed, therefore I am.
 
 
Perhaps this is why misspelled words cause me a disproportionate amount of rage to this day. When I see mischievous spelled mischeivious I don’t just think, Hey, that’s wrong, I think, WHERE IS THAT WRITER’S SELF-RESPECT? Somewhere inside my brain, first-grade-me is also wondering, aghast, Don’t you want to be loved?

I had a freelance editing client years ago, a CEO who’d been at her job for decades. She refused to accept my edits whenever I removed the double spaces she placed after periods at the ends of sentences. Again and again, I’d strip out the extra spaces and send her documents back with single spaces, and she’d add the spaces back in. That was what she’d learned in school, she insisted. I’d get purple in the face explaining that, yes, double spaces were required back in the day when everyone used typewriters but that modern word-processing programs had rendered obsolete the manual widening of the space between sentences. One space was the new rule. “Don’t you want to be right?” I’d say, exasperated. “I am right,” she’d say. Maybe we were too much alike, an impossible match.
 
 
I worry that my kids will inherit my worst traits, that they’ll turn out too much like me, fixated on racing to the finish line with a perfect score. So when they walk through the door in the afternoons these days, I ask them what they had for lunch. I don’t actually care what they ate. I mean, I do—I’m their mother, so of course I’m concerned that they’re working their way around the food pyramid or the food train or whatever it is now. The lunch question is about something else.

We’re all a little weird thanks to our mothers. I’m carrying that tradition on with my own children.

I’d be thrilled if my kids made the dean’s list, and you better believe I make them learn those extra spelling words. But I also want my daughter to try a risky science experiment, and when it goes differently than expected, I want her to shrug it off and try another one. I want my son to bring home paintings and clay sculptures he’s proud of because they’re beautiful in his own eyes, not because they got him a good grade.

So I don’t ask them about their grades the minute they come home. Silently, I give myself an A+ for this move. I award myself an invisible certificate of achievement for parenting excellence, with high honors in nurturing a value system that emphasizes effort and curiosity over quantification. I do that because over in a little corner of my head, six-year-old-me sits on a big green sofa, clutching her spelling pages, wanting desperately to hear, Good job. She never left; she’ll never leave. It’s too late for her, but not for them. They can be better than I am.

Maybe they’ll grow up to have a strange obsession with lunch, and blame me.
 
 
So there you have it.

When I was growing up, my mother was a hard-ass, and she turned me compulsive.

It’s all my mother’s fault.

* * *

Or:

When I was growing up, my mother was my cheerleader, and she made me successful.

It’s all to my mother’s credit.
 
 
Chapel Hill. First grade. My mom picked me up from school. Left to my own devices, I might have crammed those spelling pages back to the bottom of my book bag with the empty, peanut-butter-smeared sandwich baggies and the balled-up sweatshirt I hadn’t worn in a month.

But my mother intervened and changed everything. She had seen how quickly I took to books, how I’d sit and read, focusing until I got to the end of a story. She had noticed how naturally I recalled a word once I’d seen it a single time. She saw potential I could not have seen in myself at that age. She reached for that stack of spelling words.

And so my brother was sent to bed while I was allowed to stay up. I got to snuggle into the nubby pillows of the green sofa next to my mom as I learned tricks for training my brain to hold as much as it could. I found that if you spell a word out loud five times in a row, the sixth time is a snap.

“Hair. H-a-i-r. Hair,” I spelled.

“Yes!” she cried.

I started spelling words in conversation: “I’m going o-u-t-s-i-d-e now.” “Do I have to wash my f-a-c-e tonight?” My mother showed me how to bump up against what felt like the natural limits of my mind and then keep pushing into the territory that lay beyond.

When I won that spelling bee, I got a smile from my mom that no one else got. This wasn’t just regular love like all kids got from their parents. This was extra love, something more, just for me. It filled me up, and I would never again settle for anything less.

When I held out my math test with a B on it, she didn’t reward me with a smile, because she believed I could have made an A. In time, I believed I could make A’s, too. She held me to the standards she knew I could meet. As if running alongside my bike with a hand on my seat, then letting go, she guided me until I could excel on my own.

My work ethic helped me earn my way into opportunities that changed my life: contests, college, jobs, assignments. I became a person other people can count on, someone they trust to do a good job. I grew to think of myself this way, as a helpful person, a reliable person.

My mother the wonder woman made me a wonder woman, too.

* * *

Even small events can have a formative effect on our lives. Everything sinks into the soil.

That’s how I think of that first-grade spelling bee. Did it really change me from one kind of person into another? I suspect it was less a cause of my perfectionism than simply the first manifestation of it, but I remember it as a before-and-after marker on my timeline. My best guess is that something within me, some strand of DNA, was extra susceptible to the idea of quantifiable self-worth, and school was the perfect environment for it to thrive. (Seriously: a spelling bee for first graders? The 1980s were hard-core.) Plenty of other kids had strict parents, too, but they didn’t all become obsessive about grades. My brother grew up right alongside me, but when he got a B, he just went into his room and played his Bon Jovi tapes. Big deal.

Of all the genes parents pass down and values they instill, how does one take hold so much stronger than the others? How do two kids with the same genetic ingredients and upbringing turn into such different people? My brother became a high-achieving student, too, but also a sneaky, laid-back teenager, the kid who hid beer in our backyard tree house and laughed it off when he got caught. I became uptight and anxious, the one who religiously performed all three steps of the Clinique three-step cleansing system every night because the instructions said, Wash, tone, moisturize. He stood right next to me when my mother said, “Practice your piano for thirty minutes each while I’m at the grocery store.” So why did I slog through thirty minutes of Beethoven every time and then watch in fuming rage as he played video games? Does it even matter why?

It filled me up, and I would never again settle for anything less.

There’s not much I’d blame any parent for, honestly, now that I am one. Cruelty, neglect, abuse—absolutely—but word-drilling on the green sofa? No. We’re all a little weird thanks to our mothers. I’m carrying that tradition on with my own children.

What a job, to raise someone from birth to adulthood, bestowing upon them your knowledge and your values and, despite your best intentions, any number of traits you’ve inherited yourself. What a loaded task, to make every move, every day, in such a way that the impressionable larva-person in your home will see your example, process it into something within themselves, and grow layers of muscle and soul over it until she is a fully developed human being. And all the while, the little person you’re nurturing is fighting you—spitting out the broccoli, not wearing the helmet, rolling her eyes at your carefully chosen words of advice—and you become constantly worn down even as you pour your energies into loving her.

My mom gave me all the tools she had, some of which I couldn’t use. She grew up to be a plant whisperer after helping her dad tend his garden in the wild green lot behind their little house outside Birmingham, Alabama, and she tried to teach me to be one, too. I used to follow her around our backyard, watching her reach into a mass of stems and leaves with her clippers and snip this bloom or that one to toss into her basket; then I’d sit mesmerized as she stuck them into vases and bowls, creating what looked like tabletop parade floats. She’d coach me to do the same—“Here, put some greenery in, make it look softer”—and I’d stab a branch into the bunch, ruining the loose beauty of her arrangement. You point to anything with roots, and she can name it, arrange it, and/or cook it, and I can’t keep a pot of basil alive for longer than a week. Why didn’t that stick?

What did stick—whether she intended to pass it along or not—was her sense of humor. When it came to academics, my mom may have been a warlord zipped into the body of Sally Field, but the rest of the time, she cracked us up. Whenever a Little Richard song came on the car radio, she would bust a move at the wheel like a one-woman episode of Dance Fever. She let me play beauty salon and make dozens of tiny pigtails all over her head with my colorful plastic barrettes. When I was bothered by the fact that none of my Barbies had underwear, she sewed a complete trousseau of tiny lingerie. Like her, I love little visual absurdities (ah, the inherent hilarity of a teeny-weeny doll bra), dry one-liners and well-timed cracks, and perfectly executed, utterly insane mishmashes of curse words. (My mom, upon walking into a messy room: “It looks like the ass end of destruction in here.” The ass end of destruction!)

When I was seventeen, I might have told you I was a neurotic student because my mom was so tough about grades. When I was twenty-five, I might have shrugged and said, eh, maybe it was my mom who made me a control freak or maybe I’m just me, who knows. By the time I reached my thirties and had my own children, I knew perfect parenting was a myth, and I understood that while she was responsible for making me, she couldn’t have known how I’d end up made. No one could have. That’s a little mystery we all unfurl on our own.

* * *

 

“Wonder Woman” is an excerpt from the book I Miss You When I Blink © 2019 by Mary Laura Philpott, published by Atria Books on April 2, 2019.

Buy the book

 

Editor: Cheri Lucas Rowlands

‘Craft Is My Belief System. My Obligation To Writing Is Religious.’

RASimon / Getty

Lily Meyer | Longreads | March 2019 | 9 minutes (2,302 words)

 

Nathan Englander has been writing fiction about Jews in America for nearly as long as I’ve been a Jew in America. I stole my mother’s copy of his debut collection, For the Relief of Unbearable Urges, as a nine-year-old, and was both enthralled and baffled by his stories of Orthodox identity and longing.

Since then, Englander has written a play, another story collection, and three novels, the most recent of which, kaddish.com, opens with a secular Jew named Larry who refuses to say daily Kaddish for his dead father. Saying Kaddish is, according to Jewish law, the eldest son’s duty, but Larry can’t bring himself to return to the synagogue he has left behind. Instead, he finds a solution on the then-new Internet: he’ll pay a rabbinical student in Jerusalem to take on his filial duty. Years later, Larry returns to Orthodox Judaism, reinventing himself as a yeshiva teacher named Reb Shuli. He’s happily married, and comfortable in his reclaimed community. The sole stain on his Jewish life is his failure to say Kaddish for his father. His guilt swells into an obsession, and soon, he’s off to Israel to track down the proprietor of kaddish.com and get back the birthright he e-signed away.

Englander tells Shuli’s story in the language Shuli knows best: The Yiddish-inflected, Hebrew-sprinkled English of religious American Jews. He writes with humor, pathos, and irrepressible life. I thought often of Grace Paley as I read kaddish.com, and of the Coen brothers’ movie A Serious Man, which, as it turns out, Englander loves. We spoke on the phone about the Coen brothers, Philip Roth’s secular funeral, and other questions of Jewish-American identity. Like my nine-year-old self, I was enthralled. Read more…

Namwali Serpell on Doing the Responsible Thing — Writing an Irresponsible Novel

Peg Skorpinski / Hogarth

Tobias Carroll | Longreads | March 2019 | 18 minutes (4,830 words)

Namwali Serpell’s first novel, The Old Drift, tells the story of several families living in Zambia, encompassing over a century of their interwoven lives. The novel takes its title from a region located near Victoria Falls (otherwise known as Mosi-o-Tunya, which translates to “The Smoke That Thunders”), which is also where the novel begins. Along the way, The Old Drift touches on many moments in history, from the Second World War to Zambia’s foray into space exploration.

But Serpell isn’t content to simply tell the story of a nation through several generations of its residents. Instead, her narrative extends into the near future, and each of its sections is paired with a short passage written by a strange collective voice — one which doesn’t seem to be human. It’s a bold narrative choice, but it’s one that pays off brilliantly at novel’s end.

Serpell’s bibliography covers a broad range of styles and territories, from the theoretical to the metafictional. Her first book, Seven Modes of Uncertainty, explored the works of writers like Tom McCarthy, Toni Morrison, and Ian McEwan. She’s contributed the introduction to Penguin Classics’ edition of Ngũgĩ wa Thiong’o’s novel Devil on the Cross. And her short story “Company,” published in the “Cover Stories” issue of McSweeney’s, reimagines a Samuel Beckett narrative along Afrofuturist lines — a process that Serpell described in one interview as “a Janelle Monáe cover of a Philip Glass song.” Read more…

Twitter Won’t Miss You: A Digital Detox Reading List (and Roadmap)

Follow the crowds to a world with less screen time. (Photo by davity dave via Flickr, CC BY-SA 2.0)

Sara Benincasa is a quadruple threat: she writes, she acts, she’s funny, and she has truly exceptional hair. She also reads, a lot, and joins us to share some of her favorite stories. 

Have you ever needed a break, but just not known from what? Everything seems fine…ish. Your job is OK, your friendships are all right, your health is decent, nothing dramatic to report. And yet, you’re stressed. Dissatisfied. Bored. Sometimes you even feel exhausted and overwhelmed. Maybe you should distract yourself by looking at Instagram. Maybe you should find someone with whom to argue on Twitter. Maybe you should see what your ex is up to on Snapchat.

Or maybe you should get the hell off social media for a while.

At least, that’s the prescription issued by an increasingly vocal crowd of psychiatrists, psychologists, sociologists, writers, philosophers, performers, and general opinion-havers. The common term is “digital detox,” whereby an individual commits to a cessation of specific actions on their Internet-enabled devices for a finite period of time. One can go on this adventure with friends, family, or a like-minded group of strangers from, you guessed it, the internet.

I’ve been an enthusiastic and sometimes addicted social media user since approximately 2003. But after beginning my research for this column, I went on a digital detox of my own. It is small and manageable, and nothing so impressive as author Cal Newport’s suggested 30-day detox from all nonessential online functions. But it has improved my life already in measurable ways. Here are some writers whose approaches to their own vacations from the Matrix helped me shape mine.

1. “Unplugged: What I Learned By Logging Off and Reading 12 Books in a Week.” (Lois Beckett, The Guardian, December 2018)

Beckett nabbed what must’ve been the plum journalistic gig of the year: head to a tiny cabin in the foothills of the Sierra Nevada, and read. Books. Made of paper. “This was a perfect assignment,” she writes. “For journalists on many beats — including mine, which includes the far right and gun policy — it had been a year of escalating violence during which conspiracy theories had moved into the mainstream.” And off she went, blissfully unencumbered by wifi. She brought a stack of critically acclaimed books purchased at different independent bookshops and a plan to read 30 books in a week, a number that sounds patently insane to me. She read 12. I’m still impressed — and envious.

The ensuing story is littered with gentle shade, which I always appreciate, and she’s a damn good writer: “I was not going to finish all 30 books at any cost, skimming to the right section of the right chapter in order to say one smart thing — in the U.S., we call this skill a ‘liberal arts education’ — but instead wanted the books’ authors and their protagonists to collide and argue with each other, to give me some different understanding of what had happened in 2018.”

2. “#Unplug: Baratunde Thurston Left The Internet For 25 Days, And You Should, Too.” (Baratunde R. Thurston, Fast Company, June 2013)

I adore my longtime friend Baratunde, though perhaps not as much as my mother, who has met the man twice and still has a copy of his 2013 Fast Company cover story somewhere in her house. He’s a great human.

And now that we’ve established my utter lack of objectivity, let’s hear from his 2013 self: “I’m an author, consultant, speechifier, and cross-platform opiner on the digital life. My friends say I’m the most connected man in the world. And in 2012, I lived like a man running for president of the United States, planet Earth, and the Internet all at once.” That very accurate description is exactly why it was so interesting that Baratunde Rafiq Thurston, of all freaking people, did a digital detox.

At the time, I remember worrying that he might burn out or possibly just suddenly up and die due to lack of sleep, so it was clearly a good move. I can’t imagine replicating what he did (no email?!), but since he was self-employed with a personal assistant and has an incredible amount of willpower, he was able to pull it off. His nine-point digital detox preparation checklist is incredibly helpful, and I intend to use it the next time I do one. My favorite line? “She transmitted this data by writing down the names on a piece of paper.” And yes, he was happier and healthier by the end of the experience. To this day, he goes on regular social media vacations, and I believe he’d tell you his life is better for it.

3. “Quit Social Media. Your Career May Depend On It.” (Cal Newport, New York Times, November 2016)

“I’m a millennial computer scientist who also writes books and runs a blog,” Newport writes. “Demographically speaking I should be a heavy social media user, but that is not the case. I’ve never had a social media account.” Newport lays out in plain, accessible language the notion that social media distracts from good work because it is designed to be addictive. It’s a notion with which I agree, based in no small part on my own lived experience; I have no doubt my writing output has suffered as I’ve devoted more and more time to social media. As Newport writes, “It diverts your time and attention away from producing work that matters and toward convincing the world that you matter.”

4. “Cal Newport on Why We’ll Look Back at Our Smartphones Like Cigarettes.” (Clay Skipper, GQ, January 2019)

Fast-forward a little more than two years. Newport, by now an in-demand speaker and author of two books — the latest is Digital Minimalism: Choosing a Focused Life in a Noisy World — expands on his November 2016 Op-Ed. Newport is a reluctant self-help guru who would undoubtedly reject that label. In this interview (as in the one I heard with him on fellow PoB (Pal of Baratunde) Lewis Howes’s podcast “The School of Greatness”), Newport stresses that he doesn’t typically offer a program or prescription. However, his recommendation for a 30-day digital detox seems simple in concept and necessarily jarring to execute: one dispenses with all digital products that are unnecessary to one’s career and personal health. Check your work email and log into your bank app to ensure a direct deposit has gone through, but let Facebook, Twitter, and Instagram accounts lie fallow for 30 days. Skipper is an able interviewer and Newport is a clear, experienced, and intelligent interviewee.

5. “I Quit Social Media for 65 Weeks. This Is What I Learned.” (Kareem Yasin, Healthline, February 2018)

Yasin interviews David Mohammadi, who left social media for over a year and loved the experience. A newly minted New Yorker, he abandoned the online pseudo-friendship industrial complex because he was worried he’d obsess over what was happening back in San Francisco. And he had good reason to suspect he’d be homesick — he’d tried the East Coast thing once, been endlessly captivated by his Bay Area friends’ Facebook updates, and ended up moving back to San Francisco. Years later, a more mature Mohammadi quit his job and decided to start a new career in New York with a clear mind unclouded by social media-induced FOMO. You likely won’t be surprised to hear his take: “The first week was hard. The second week was nice. And as I got closer to the end date, I just was like: ‘Wow. It feels great to be so present, and not just on my phone.’” But the benefits didn’t just extend to mental health — he made more money, too! Yasin writes, “Working as a boutique manager, [Mohammadi] noticed how his coworkers would constantly check their phones. Those two-minute breaks from the real world robbed them of opportunities to get more commissions — opportunities that would be theirs if they would just look up and notice the customers.”

* * *

Like you, probably, I have a personal Instagram account. Except it isn’t personal, really — with 14,200 followers, it is ostensibly a way to cultivate and grow an online brand based on me, myself, and I. I write essays and books; I do comedy shows; I lecture on mental health awareness at colleges; I pop up as a talking head in various capacities in various venues. Like you, probably, I want to be seen as an attractive person, so sometimes I use filters or put on more makeup than is absolutely necessary for a selfie. Like you, probably, I want to be seen as a capable person worthy of being hired, so I do my best to seem witty and fun but chill, man. Given that I want to write more for television and that a lot of my work falls under the category of “entertainment,” I have followed the conventional thinking in my industry, which boils down to “Always be selling (yourself).”

This thinking extends to my “personal” Twitter account (77,400 followers), despite my many qualms about the ethics of its overseers with regard to threats and harassment. It extended to my Facebook fan page, until I quit Facebook altogether because I don’t care what my least-favorite racist relative ate for breakfast — if I want to know what’s up with a boring person from high school, I’ll make private inquiries. When the current Russian government really loves something, I have to ask myself if I need that something in my life. (Note: I am aware that Facebook owns Instagram, and that I’m a hypocrite sometimes.)

Then there’s the Instagram account for my podcast (679 followers) and the Twitter account for my podcast (457 followers) and the Instagram account for my progressive lady-coat art project (26,200 followers). I don’t use Snapchat, because once I joined for 24 hours and my drunk friend sent me a dick pic framed by monogrammed his-and-hers towels in the master bathroom he shares with his girlfriend; I’m a Scorpio, and pseudoscience and common sense immediately told me the power of the Snap was too great for my personal constitution to handle. I also recently joined a few dating apps. And that led to more swiping, more clicking, more texting, more aggravation of writing-induced carpal tunnel issues. When an ex-NFL star asked me on what I’m sure would have been a super safe and not-gross date to his house at 3 a.m., I decided that Tinder was also too much for me.

At this point, and considering my sore wrists, the signals seemed to say, “SARA. TAKE SOME TIME OFF THE SOCIAL MEDIA.” I had 104,000 followers across social media, some of whom were double or triple followers and some of whom were robots, and while I loved each of them like my very own imaginary baby, Mommy needed a vacay.

First, I enabled the Screen Time function on my phone and discovered that I use it, on average, over seven hours a day. This horrifying fact led me to design the parameters of my moderate digital detox: I’d continue to use my email for work and social reasons. I would continue to use Twitter, but only to share my work or the work of a friend or charity. I would post a note announcing that I was taking an Instagram break until April 9, the day the second season of my podcast debuts, both to give a heads-up to any former professional athletes that I wouldn’t be interacting with them there and to announce the premiere date. I would text when I felt like it, but leave my phone facing down when I wasn’t using it. I would remove Instagram from my phone, just as I’d done with Twitter months prior. At night and during my daily meditation practice, I would put the phone on airplane mode.

Following those simple rules, and only occasionally breaking them, I managed to reduce my phone time by 10 percent in the first week. I resumed the regular at-home yoga practice I’d attempted a month prior. I finished the outline of an hour-long TV drama pilot. I went on actual face-to-face dates with humans during daylight and appropriate evening hours. I visited with two friends. I got the “annual” physical I’d put off for two years. And I wrote this column.

While I intend to resume using Instagram on April 9, I will do as Cal Newport recommends: use social media like a professional, for specific purposes, and do not stray from said purposes. Twitter and Instagram will remain places for me to share my work and the work of friends and charities I admire. Sometimes, I will use these places to discover great writing, music, and more. Moving forward, I want to reduce my screen time by 10 percent each week until I average under four hours per day on my phone — and then I’ll try to reduce it even more.

I’m pleased with my progress. It may seem meager, but it’s a start. And I feel better already. So if you’ve considered quitting social media but have some qualms, do what I did: start small. Pop your head above the churning surface of our wild, untrammeled internet, and take a look around. Stay awhile. Your eyes will grow accustomed to real sunlight soon enough, and it’ll be easier to breathe. It’s pretty nice up here.

* * *

Sara Benincasa is a stand-up comedian, actress, college speaker on mental health awareness, and the author of Real Artists Have Day Jobs, DC Trip, Great, and Agorafabulous!: Dispatches From My Bedroom. She also wrote a very silly joke book called Tim Kaine Is Your Nice Dad. Recent roles include “Corporate” on Comedy Central, “Bill Nye Saves The World” on Netflix, “The Jim Gaffigan Show” on TV Land, and the critically acclaimed short film “The Focus Group,” which she also wrote. She also hosts the podcast “Where Ya From?”

Editor: Michelle Weber

Queens of Infamy: Josephine Bonaparte, from Martinique to Merveilleuse

Illustration by Louise Pomeroy

Anne Thériault | Longreads | March 2019 | 22 minutes (5,569 words)

From the notorious to the half-forgotten, Queens of Infamy, a Longreads series by Anne Thériault, focuses on badass world-historical women of centuries past.

* * *

Looking for a Queens of Infamy t-shirt or tote bag? Choose yours here.

In 1768, a 15-year-old girl traveled to the hills near her family home in Martinique to visit a local wise woman. Desperately curious to know what her future held, the girl handed a few coins to the Afro-Caribbean obeah, Euphémie David, in exchange for a palm reading. Euphémie obligingly delivered an impressive-sounding prediction: the girl would marry twice — first, unhappily, to a family connection in France, and later to a “dark man of little fortune.” This second husband would achieve undreamed of glory and triumph, rendering her “greater than a queen.” But before the girl had time to gloat over her thrilling fate, Euphémie delivered a parting blow: in spite of her incredible success, the girl would die miserable, filled with regret, pining for the “easy, pleasant life” of her childhood. This prophecy would stay with the girl for the rest of her life, and she would think of it often — sometimes with fervent hope, sometimes with despair, always with unwavering belief that it would come true.

That girl was the future Empress Josephine Bonaparte. Everything Euphémie predicted would come to pass, but young Josephine could not have imagined the events that would propel her to her zenith: the rise through Paris society, the cataclysm of the French Revolution, the brutal imprisonment during the Reign of Terror, the transformation into an infamous Merveilleuse, the pivotal dinner at her lover’s house where she would meet her second husband.

She wouldn’t even have recognized the name Josephine — that sobriquet would be bestowed by Napoleon some 18 years hence. The wide-eyed teenager who asked Euphémie to tell her fortune still went by her childhood nickname, Yeyette.

Read more…