Lumbersexuality, a Sport and a Pastime

Illustration by Homestead

Jonny Diamond | Longreads | June 2019 | 22 minutes (5,308 words)

The sound is the first thing you notice, deep and hollow, burnished steel hitting chewed-up white pine. It’s not quite the warm, resonant thok of an axe in the woods, but the nearest forest of any significance is 50 miles up the Hudson River. This is Brooklyn, one very long bow shot from the Gowanus Canal.

It’s a chilly Monday night before Thanksgiving and Kick Axe Brooklyn is surprisingly full. Around two dozen people cluster in groups of six or eight across several “ranges,” tidily built versions of the old roadhouse bar-band cages, target at one end, party at the other. There doesn’t appear to be any flannel in the crowd (for now) but there are at least three reasonably grown-out beards in plain sight. One of the beards puts his beer down next to a basket of plastic Viking helmets and walks forward to pick up an axe from a squat round block of maple (each range has one of these blocks, to which the axe is returned after it is declawed from the wood).

Nobody pays much attention as he squares himself to the softwood target 16 feet away, holds the axe — specifically, an Estwing hatchet weighing about a pound and a half — with both hands, and raises it above his head. Then, in a surprisingly fluid motion, he steps toward a faded red line on the floor and releases the hatchet in the direction of several concentric red and black circles painted on the wood, axe head over handle, where it sticks fast about six inches to the left of the bull’s-eye. He shakes his head, pulls the axe from the wood, and goes to collect his beer.

Scenes like this occur with increasing frequency in cities across North America, from Toronto to Austin to L.A., as axe-throwing clubs attempt to create their own niche and fill it, something like a laidback millennial bowling alley except with deadly weapons. For some, particularly since the election of Donald Trump, the physicality and latent violence of axe throwing has served a therapeutic purpose. As Megan Stielstra wrote in an essay last year for The Believer, “I threw axes throughout the fall, waking up every morning to new impossible cruelties. … I kept trying to pass the axe to my husband, but he wouldn’t take it. ‘You need it more than I do,’ he said from behind the yellow spectator line.”

Aside from its salubrious value, the basic appeal of axe throwing is not complicated: Like bowling or billiards or darts, it is a way to give loose structure to any given social gathering. When I ask Kick Axe’s Nathan Oerstler if he’s ever had to deal with any drama among the beer-drinking axe throwers, the recently promoted “axe master” (up from “axe-pert” — there is no pun left unmade at Kick Axe, as the name might suggest) demurs, explaining that most of the axe-perts are comedians or actors — theater types — and serve as much as entertainers as they do instructors or referees: in short, they keep the people happy. Kick Axe opened in December 2017 and is more flannel-inflected theme park than bar, its employees communicating via headset about which targets need replacing, which axes need sharpening. This level of organization makes sense when you consider the hundreds of pounds of deadly steel flying through the air at any given moment, but axe throwing wasn’t always this professionalized: In fact, the origin of the axe-throwing social club is basically a bunch of bored Canadians in the mid aughts, standing around drinking beer and chucking hatchets at backyard waste wood.

As Backyard Axe Throwing League (BATL) founder — and one of those bored Canadians — Matt Wilson recounted, people kept showing up to throw axes in his backyard, so he had no choice but to grow. And grow he did: The BATL, which has 10 locations in Canada, has since expanded into the U.S. with spots in Chicago, Nashville, Scottsdale, Houston, and Detroit. This unlikely success story has spawned competitors: Ontario’s Bad Axe now has 15 locations across the U.S.; the aforementioned Kick Axe also has locations in Philadelphia and Washington, D.C., and is opening more in Florida and Texas; and there are at least a half dozen independent axe-throwing venues across the country (including Massachusetts’s Half Axe, whose name heralds the end of the useful axe pun, or at least demarcates its nadir).

Whatever side of the border these clubs are on, most of them affect a shaggy, woodsy aesthetic, a little plaid here, some taxidermied animal there. One could say the same thing of many of their patrons, from Calgary to Orlando: red-and-black Buffalo check accenting high-cut oxblood Red Wings; gray chambray tucked into vintage denim; Carhartt jackets over Carhartt vests over old Woolworth’s shirts.

Most of the axe-perts are comedians or actors — theater types — and serve as much as entertainers as they do instructors or referees: in short, they keep the people happy.

This aesthetic — lumbersexual, which entered the mainstream vernacular in 2014, at a site called GearJunkie, and was just as quickly derided on Gawker and in The Atlantic — is certainly not limited to axe-throwing clubs (one could make the case that axe throwing as a pastime has arisen, inevitably, from the aesthetic). But as a loose set of fashion signifiers, lumbersexuality has been around in some form or another for a generation, competing with any number of the self-consciously vintage looks manifested in hipster culture.

As with so many of the aesthetic strands that make up any given tangle of contemporary style-consciousness, lumbersexuality’s origins can be found on the margins, one more example of straight culture borrowing heavily from gay culture, with half the commitment and none of the risk. Beards and bears and woodsy scruff have now fully entered the mainstream as the contemporary lumbersexual reappropriates the same tropes of classic American masculinity so long adopted and amplified in LGBTQ spaces. But even the original tropes themselves — of paternal strength and rugged stoicism — are products of male fragility.

As Willa Brown points out in the perfectly titled article “Lumbersexuality and Its Discontents,” the endless talk in the past decade of a crisis of masculinity is part of a long tradition in the patriarchal American imagination. In Brown’s oft-cited 2014 account for The Atlantic, the nostalgia-ridden aesthetic of the lumberjack has always been an outsize performance instigated by the insecurities of straight, white men, be it 1905 or 2005. But where Brown saw an imminent expiration date for the lumbersexual, its demise doesn’t appear to be coming any time soon.

As traditional hierarchies very slowly flatten into a more equitable distribution of power across society, the current crisis of masculinity is finding extended life in the backwaters of the internet. And while the real crisis of masculinity is male violence against women, the proliferation of pseudo-intellectual charlatans simultaneously seeding and harvesting the anxieties of young men for their own uses isn’t helping.

Male fragility isn’t going away. Nor is the flannel. Because there’s another performance happening here: different stage, same costume.

***

Back-to-the-land nostalgia has existed in the United States for almost as long as there’s been a United States, at various points manifesting as religious isolationism (think saucer-eyed Protestant sects one valley over), transcendentalist escapism (rich white guys reading poetry in the gloaming), and communitarian anti-capitalism. Its latest incarnation — rooted chiefly in an environmentalism that gestures at change through practice rather than policy — has been about bringing the virtues of the land back to the city, reimagining the frontier as urban rather than rural: a bespoke localism that animates everything from figurative fireside hobbies like pickling and needlepoint to larger-scale industry like rooftop farming, craft-brewing, and restorative, salvage-based building.

But in the same way the “frontier” of the 18th and 19th centuries was a romantic way of describing a slow genocidal war of settler colonialism, so too did gentrification’s border zones, from the mid 1980s to the late 2000s, serve as sites of displacement far more than of the idealized renewal imagined by urban planners. From its early days, gentrification was similarly romanticized with the language of westward expansion, those in its vanguard heralded as “settlers” and “urban pioneers.”

For good or for ill, these “pioneers” — composed largely of artists in search of an affordable life in the city, abetted by canny real estate speculators — wore the mantle proudly as they built out semi-legal living spaces in (often but not always) sparsely populated post-industrial neighborhoods, sometimes squatting entire buildings. They were essentially homesteading — stealing power from the grid rather than rendering tallow, jury-rigging plumbing instead of digging wells — leading precarious DIY lives based on many of the virtues of the old frontier: resilience, independence, ingenuity, competence.

There was among this early, punk-inflected group of gentrifiers — buried under layers of rebellion and irony — a quiet reverence for working-class utility, often expressed in an aesthetic straight from their stepfathers’ closets: old beat-up boots, blue short-sleeve work shirts (bonus points for actual name tags), paint-spattered coveralls, and … flannel.

This commodification of rural life and labor feels, at best, like a post-industrial Instagram fantasy, personal branding available a la carte or by kit.

Much ink has been spilled on the mass-cultural half-life of flannel, but it wasn’t until the Seattle grunge scene exploded into the mainstream in the early 1990s — with a look that had begun with bands like Minutemen and Minor Threat a decade earlier — that flannel would achieve its high fashion ascendancy, showing up in collections by designers like Alexander McQueen and Vivienne Westwood and never really going away. The aesthetic and political interplay of these subcultures — gay, punk, DIY — would continue through the early 2000s as a youth culture raised on environmental angst looked further into the past for alternatives to the increasingly apparent cruelties of late capitalism, withdrawing to a kind of privileged moral quiet room in the handmade, the local, the slow.

Here then was a hardworking, readymade look, an identifying aesthetic with a notional connection to virtues of self-sufficiency, sustainability, the wild, and, if not out-and-out Luddism, at least an appreciation of analog competence.

But what happens when the performance overtakes the performer, when the flannel habit intensifies from urban axe throwing to rural woodcraft? What happens, in other words, when you finally buy an axe?

Well, it depends on the axe — and the performer, for that matter. If you’re Justin Timberlake, in his Man of the Woods era, the axe in question comes with a private Montana “ranch.” Timberlake, who grew up in suburban Memphis, has lately been performing a return to nature (nature in this case being the exclusive 15,200-acre Yellowstone Club, a 21st-century millionaire land rush catering to those who want the gated community without having to see the gates). The streamable georgics resulting from this relocation — manifested as the 16 tracks on his February 2018 album, Man of the Woods — reveal little of Timberlake’s relationship to the actual woods (or mountains or fields or wilderness) and present more like a checklist of urban-versus-rural cliché, the kind you might find in the playbook of any halfway decent political operative aiming to divide and conquer. Here are some lyrics from the album’s seventh track, “Supplies”:

’Cause I’ll be the light when you can’t see

I’ll be the wood when you need heat

I’ll be the generator, turn me on when you need electricity

Some shit start to go down, I’ll be the one with the level head

The world could end now, baby, we’ll be living in The Walking Dead

Translation: My hard-won know-how (money) will save us when the poors run out of stuff. (Also, a cavil, but one doesn’t “turn on” a generator like a lamp, one starts it like a lawnmower … and “start me up” would have worked here!) In track 11, titled, naturally, “Flannel,” he sings:

Right behind my left pocket

That is where you’ll feel my soul

It’s been with me many winters

It will keep you warm

Ooh, here’s my flannel

The character’s in the way you wear it

Translation: I wear grandpa shirts and grandpas are good guys. Then, on track 14, “Living Off the Land,” we hear that:

You have to be comfortable with yourself

because that’s all there is

There’s you and nature

Soon as you think you got it all figured out, you know,

the wilderness will figure some way to teach you a lesson

As I’m alone in the forest, I’m one with my surroundings

and there’s a lot of peace in that solitude

I’ll be a mountain man ’til the day I die

 

(Living off the land)

And I break my back

And I work all night

[. . .] I’ll be damned, sometimes it’s hard,

the backed-up bills on the credit cards

Translation: One time I got a little lost on the way to Bill Gates’s cookout. It was tough. And these are the more thematically substantial tracks!

One might find more insight into how the Big West has rubbed off on the Big Pop Star with a quick look at the wilderness-adjacent merchandise from the Man of the Woods Collection, one item for each of the album’s tracks. These include nods to practical Americana like a wool Pendleton blanket, a tin of beard butter, and a trucker vest; objects from the collection that correspond to the tracks above are:

Track 7: A strongbox

Track 11: A flannel shirt, obviously

Track 14: A Best Made Co. felling axe, with custom-painted handle

These items, along with a cooler, a jean jacket, a bandanna, and more, were all available for sale at a Lower East Side pop-up shop the week the album was released, a kind of company store for Timberlake Inc.

As brother to a trucker and an actual lumberjack, I find it hard to fully understand totems of daily labor so dramatically upsold to “influencers” under the banner of authenticity. But as obvious a target as Timberlake is for derision, he’s more of a symptom than he is a cause, one more in a long line of mythologized white men, from Paul Bunyan to John Wayne, out there taming the wild as they tame themselves (but not too much), spokesmodels in the endless ad campaign for America that began with Horace Greeley telling us to go west and live off the land.

And that’s the dream we’re still being peddled, embodied by the upsold axe. That the axe in question is hanging on the wall of a pop-up store in downtown New York creates a particular kind of dissonance: Timberlake Inc. is almost too perfect a microcosm for the stylized repackaging of the outdoors, for the yearning after a frontier that never really existed and the rural “working-class” sensibilities that accompany it. This commodification of rural life and labor — its ruggedness, its whiteness — feels, at best, like a post-industrial Instagram fantasy, personal branding available a la carte or by kit; at worst, it perpetuates pernicious stereotypes, both racist and classist, about natural purity and rural misery, a paradox in service of the powerful.

As brother to a trucker and an actual lumberjack, I find it hard to fully understand totems of daily labor so dramatically upsold to ‘influencers’ under the banner of authenticity.

But life adjacent to wild spaces — and the work that sustains it — can be good, regardless of your politics. The braiding of masculinity and wilderness is as old as the American frontier, but it’s worth considering how we might untangle the two, worth considering how we might live with the forest world — and all it has to offer us — without destroying it.

***

But maybe you’re not a rich, world-famous pop star with a flannel fetish (if you’ve read this far, it’s likely you are not). Sure, axe throwing seems like a fun thing to try, but lately you’ve been spending more time upstate (whatever state that might be) car camping, or staying with friends who’ve left the city; there are campfires, fireplaces, wood to be chopped, logs to split. You are thinking of buying an axe of your own.

Where to start?

There are three basic types of axes you might acquire: a hatchet, for light camp use like limbing branches and making kindling (12 to 18 inches long, around 1.5 pounds); an all-purpose camp axe, for clearing saplings and light splitting (20 to 28 inches, around 2.5 pounds); and a felling axe, for chopping down trees (30+ inches, between 3 and 4.5 pounds). Within each of these basic categories there are dozens of varieties, based largely on the regions from which they originate: the Allagash Cruiser, the Hudson Bay Camp Axe, the Dayton Railsplitter, etc.

Whatever you’ve chosen, the first thing you’ll notice is the weight: a multipurpose Swedish forester’s axe — weighing three pounds — is a manageable tool, useful on smaller trees and for light splitting. You’ll probably pick it up by the end of its American-hickory handle using your dominant hand. If you’re lucky, it comes to you as an already well-used and well-loved tool, the wood worn to a tacky smoothness by years of sweat and sap and the occasional reapplication of linseed oil. It will feel heavier than three pounds should.

Next, you’ll probably hoist the heavy end up into the other hand, striking a slightly awkward pose halfway between lumberjack and serial killer.

Perhaps the light will catch the burnished cheek of the blade, and you’ll reach a tentative finger to the hardened edge, which, if properly sharpened, can dry-shave the hairs from your arm. You’ll continue to feel that weight, three pounds starting to feel like 30, and you’ll begin to wonder: What can I chop with this? The axe is one of the oldest tools we have, designed, essentially, by gravity (which does most of the work anyway) — when you pick it up, you’ll want to let it fall.

Let’s say you’re in the woods — on a weekend camping trip or at a friend’s woodsy cabin — so there’s a lot it could fall on. For a first swing, a nice, newly downed log is good for practice — in a wild forest, there should be plenty of recent deadfall not yet rotten.

You stand square to the log — imagine it as Eastern red cedar, for its intense scent and lurid scarlet heartwood — and raise high the axe. The weight will do the rest. If the swing is true, there will resonate from the tree — through still-growing sapwood to the compressed cells of the dying core — a deeply satisfying, percussive boom, scattering birds and startling deer. The first swing invites another, and then another, until a deep ringing rhythm echoes through the forest. It’s hard work, but in its repetition it is meditative.

That sound, of axe on wood, calls back to a hundred generations of humankind, invites considerations of how our ancestors might have understood their place in a world covered by forest. Sitting there, axe across knees, taking a breather, it’s not so hard to imagine them.

Shaggy Briton woodsmen in the vast pre-Roman forests of Cumbria, gripping their sacred Langdale axes, with glimmering heads knapped from the rare volcanic greenstone mined from the Pike of Stickle.

A barefoot Japanese carpenter moving gingerly across a hinoki cypress, swinging his heavy, long-handled masakari, leaving palm-size chips of wood as a massive six-by-six beam reveals itself from the 16-foot log.

A pair of Basque foresters, generations ahead of the chainsaw, laboring astride two great beech trees pulled from deep within the Irati Forest, locked in a traditional aizkolaritza, a village-wide test of strength, precision, and endurance to see who might hew the finest, fastest timber.

Tireless Henderson Islanders squaring off Pacific rosewood, adzes made from giant clamshells, chewing out chocolate shavings from the dark heartwood. 

A thousand miles and a thousand years separate these moments of labor, and at the heart of each, the same basic motion: Pick up the heavy thing and let it fall; let the weight do the work, or at least half of it.

This is the allure of the axe: It is a simple, efficient tool charged with power and violence; it lets us measure our labor swing by swing, as we gather fuel for heat or timber for shelter. To look at a stand of trees, axe in hand rather than chainsaw, is to understand it not as a resource for the coming weeks or months, but for subsequent years and generations. And though the axe confers an intoxicating dominion, over woodlot and wood target both, it is a tool that invites a way of seeing that is very old indeed. The various eras of human prehistory seem named for dynastic families from alien worlds — the Mousterian, the Denisova, the Aurignacian. It is the Acheulean in which early stone hand tools begin to flourish, particularly what is now referred to by paleoanthropologists as the “hand-axe.”

The Acheulean “hand-axe” is not an axe in the modern sense; really, it’s just a big rock with two chipped-off edges, bits of flint “knapped” away to create a biface the better to dig or cut with, to remove bark from a tree or, even, to fell that tree by hand. Perhaps, also, the better to kill with, human history providing no shortage of reminders that any distinction between tool and weapon derives from delusions of civilization. 

The finer specimens of these hand-axes, unearthed across Europe and Africa, from the Fells of Cumbria to the river gorges of the Olduvai Valley, have the shape of great and heavy tears. For centuries, British farmers, turning one up with plough or spade, thought of them as thunderstones, specially formed rocks either dropped from the heart of terrible storms, or seeded deep beneath the earth by lightning strikes, gifts of creation, that man might make better dominion of a world made just for him. 

Hand-axes represent the evolution of a very basic technology, and one can imagine that moment when the blunt rock was discarded for the edged rock, followed quickly by the thought, in not so many words: “What if I made this even sharper?”

And so these rough-hewn stones-as-tools, ranging in size from an iPhone to a toaster, underwent refinement over scores of generations — and with that refinement toward balance and symmetry, they began to take on value, both material and spiritual. Hand-axes, their abundance and quality, became a symbol of wealth, a currency; and those created from rarer elements (the deeper in the earth the better) were revered as religious symbols, not to be used as tools, but rather thought of as we now think of art. As the French paleoanthropologist André Leroi-Gourhan puts it, in contemplating the unlikely craftsmanship of such early humans:

It seems difficult to admit that these beings did not experience a certain aesthetic satisfaction, they were excellent craftsmen who knew how to choose their material, repair defects, orient cracks with total precision, drawing out a form from a crude flint core that corresponded exactly to their desire. Their work was not automatic or guided by a series of actions in strict order, they were able to mobilize in each moment reflection and, of course, the pleasure of creating a beautiful object.

Though Leroi-Gourhan is writing about human beings 10,000 years ago, he could be describing a certain strain of contemporary axe maker, for whom an axe is just as at home on a pristine West Village gallery wall as it is in the back of a woodshed.

About a decade ago, Peter Buchanan-Smith, a Canadian designer living in New York City, found himself in need of a hatchet to make some kindling. Looking to grill a choice cut of meat over a hot, wood-fueled fire, Buchanan-Smith found himself unimpressed by the cheap, poorly made imports at nearby hardware stores (dull edges, synthetic handles), so he expanded his search for a better, American-made tool.

The story might have ended there, but shortly after Buchanan-Smith finally did get his hands on a decent axe, he decided to customize the handle in colorful stripes: and just like that, the Best Made Co. was born. (Buchanan-Smith declined to talk to me for this story and is, I’m told, transitioning away from the company.)

Things happened quickly from there. Buchanan-Smith, who’d won a Grammy for his artwork on a Wilco album and who’d done design work for Isaac Mizrahi and David Byrne, was well known among New York’s art and design community, and very soon after the first axe was painted, it was hanging on the wall of Partners + Spade in Manhattan. That was in May 2009; a month later, in anticipation of Father’s Day, the fledgling brand sold out its stock (100 axes) in an hour.

The past decade has been a good one for Best Made Co., with the opening of a flagship store in lower Manhattan, followed by a 2,700-square-foot showroom in L.A.; and on top of their apparent domination of the bespoke axe market, the company has gone all in with a full line of forest-forward gear and apparel. So, if anyone has a full view of the aesthetic arc of lumbersexuality, it’s Buchanan-Smith, who’s described his ideal customer as “Alaskan Charles Eames (rather than Brooklyn Grizzly Adams).” And while someone who relies on tools but also likes good design is certainly cooler than someone who dresses up like someone who relies on tools, it helps that the former usually has a little more money to spend than the latter.

One might wonder how great the difference could possibly be from one axe to the next, but it only takes an afternoon at the wood pile to appreciate good steel as opposed to bad: the former holds its shape longer, has a stronger edge, stays sharper, and is less prone to chipping or breaking, all of which makes for a safer, more efficient axe. It is taken for gospel — at least on the internet of old guys and their tools — that the older the axe, the better the steel.

You are thinking of buying an axe of your own. Where to start?

If you’re looking, it’s not hard to find someone in just about every rural county in the country with a grinding wheel, a set of files, and a strop, who will take your grandfather’s axe and return it to its former glory. And for every one of those guys there are a hundred others hanging out in online forums asking one another the best way to rebevel the edge on a timber-hewing broadaxe or how to de-pit the cheek of a 100-year-old New Jersey pattern felling axe. (To its credit, Best Made’s L.A. store has a counter devoted to restoring and refurbishing old tools, from cast-iron pans to axes.)

Navigating sites like BladeForums.com and TalkBlade.info, a theme begins to emerge: New, mass-produced things are bad; old, handcrafted things are good. And while there’s an awful lot of grumpy conservatism burbling through these forums, spiked with a mild dose of over-the-counter libertarianism, if you squint past the bumper-sticker usernames and shallow isolationism, the underlying politics run parallel to much of the contemporary green movement, from the embrace of all things local to a rejection of late-capitalist disposability. Granted, from the conservative direction these politics are rooted in a nostalgia that veers into apocalyptic nativism, but it is bewildering to see how similar in outlook — when it comes to craftsmanship, consumerism, conservation — people who otherwise identify with opposite ends of the political spectrum can be.

***

Politics doesn’t come up much at my return visit to Kick Axe for the opening of spring league night — it’s likely that the ideological spectrum here is similar to any Brooklyn bar on a Monday evening, which is to say not as liberal as Twitter would have you believe. I sit back and watch 76 amateur axe throwers crowd around league master Anthony Oglesby, who stands upon a stump introducing new rules and reminding competitors of the old, part carnival barker, part vice principal.

There is more flannel in this crowd than the last time I was here, more self-conscious woodsiness expressed through beards and boots, so I’m not exactly sure where Melanie Serrapica fits in. In her late 20s, Serrapica is wearing a semiformal low-cut red dress, and if it weren’t for the custom-painted hatchet she holds lightly in her right hand, its handle a gradient from lustrous black into midnight blue, I’d assume she’d entered the wrong bar.  

“[Axe throwing] is a great way to blow off steam after coming from work, where you want to throw things at people but aren’t allowed,” Serrapica deadpans, despite having to yell over the anticipatory din of her fellow axe throwers. Her friend Sara Morabito nods in agreement. “We’re two nerds who don’t do things other than conventions,” she says, gesturing to her fiancé Chris Knowles. “This was the first athletic thing where we were both like, ‘We’re really good at this.’ It’s a great thing to do together.”

Like Serrapica, Morabito and Knowles fell hard for the pleasures of axe throwing, and also have their own custom axes (hand-painted by fellow league member, Tommy Agniello) — unlike Serrapica, they have yet to name their axes. “Yeah, I named it Axe-Po,” Serrapica says. “You know, like B-MO from Adventure Time?” (I don’t.) As the subject turns to axe care and sharpening technique, I ask the trio why they think axe throwing has become so popular. Chris (who favors a double-grit sharpening puck for maintaining his blade) gets to the heart of it: “It’s something that feels masculine and outdoorsy, and I think people are looking for that.”

This is the allure of the axe: It is a simple, efficient tool charged with power and violence.

 

You don’t need a gender studies degree to understand that ideas of masculine and feminine exist on a spectrum that doesn’t map across a male-female binary; in fact, the league crowd is as diverse in gender as you’d expect of a bar in Brooklyn on a Monday night. As I circulate among teams with names like Inside the Axer’s Studio, Axes of Evil, and Well, Axetually, interrupting people as they get in a few more practice throws before the competition starts, one name keeps coming up: Rebecca. The best. Unbeatable. Rebecca is the best axe thrower. “Number one last season, and the season before.” Nobody knows if she’s coming tonight, nobody seems able to spot her or her girlfriend in the crowd. Someone thinks she might have moved upstate, “to be closer to the woods,” and I can’t tell if they’re fucking with me. She’s already a legend, the more so in her absence.

People are drinking — each league night has its own beer sponsor — and it gets noticeably louder as the new season begins, the title wide open and up for grabs in this new and Rebecca-less reality. Soon I notice a woman pressing a call button next to her range, an intense look on her face: It’s too early for a wood replacement on the target, so she’s looking for a judgment. An axe-pert calls the league-master over, and all parties approach the target, like lawyers approaching the bench, to peer and point at an axe stuck just off the bull’s-eye. League-master Anthony waves over Kick Axe’s manager, Nic Espier, who — with his suit and his earpiece, looking like he’d take a bullet if ordered to — goes over to settle the issue.

“Seven points decided last year’s title,” he tells me, after judging in favor of the button-pusher. “These guys look like they’re having fun, but they take it pretty seriously.”

The pleasures of axe throwing or wood splitting or tree felling aren’t for everyone — nor, indeed, are they available to most. But it would be a shame to dismiss these things we yearn for — open spaces, wilderness, a particular kind of labor — simply because we’ve had them so relentlessly repackaged and sold back to us.

So let the axe be many things — tool, work of art, diversion — but let it also be a way back into the forest. Let this very old machine remind us of our limits and show us not what is ours to use, but ours to preserve.

***

Jonny Diamond is a writer and editor who splits his time between New York City and the Hudson Valley. His fiction and nonfiction have appeared in The Missouri Review, Geist, Hobart Pulp, Rolling Stone, Literary Hub, and elsewhere. He is currently working on a book-length object history of the axe, part investigation of its symbolism in America’s westward expansion, part interrogation of contemporary tropes of masculinity and wilderness. He is the editor-in-chief of LitHub.com.

Editor: Kelly Stout
Fact checker: Ethan Chiel
Copy editor: Jacob Gross

Total Depravity: The Origins of the Drug Epidemic in Appalachia Laid Bare

Getty / Black Inc. Books

Richard Cooke | Excerpt from Tired of Winning: A Chronicle of American Decline | Black Inc. Books | May 2019 | 21 minutes (5,527 words)

They shall take up serpents; and if they drink any deadly thing it shall not hurt them; they shall lay hands on the sick, and they shall recover.

Mark 16:18

One night John Stephen Toler dreamed that the Lord had placed him high on a cliff, overlooking a forest-filled valley. He had this vision while living in Man, West Virginia, where some of the townsfolk thought he was a hell-bound abomination; he countered that God works in different ways. The mountains were where he sought sanctuary, so he felt no fear; but as he watched, all the trees he could see were consumed by wildfire. It was incredible, he said, to see ‘how quick it was devoured’, and the meaning of the parable was clear. The forest was Man and the fire was drugs, and when the drugs came to Man, that was exactly how it happened – it was devoured ‘so fast, that you didn’t even see it coming’, he said. We were in Huntington, West Virginia, and by now John Stephen Toler was in recovery.

Read more…

An Audience of Athletes: The Rise and Fall of Feminist Sports

womenSports, Bettmann / Getty

Britni de la Cretaz | Longreads | May 2019 | 26 minutes (6,609 words)

The idea for womenSports magazine was born in a car suspended over the San Francisco Bay by beams of steel. Several weeks before she captivated the nation by beating Bobby Riggs in the “Battle of the Sexes” tennis match in the fall of 1973, Billie Jean King sat in the passenger seat of a car and stewed. At the wheel was her then-husband, Larry, driving the couple from Emeryville near Oakland toward San Francisco on the Bay Bridge, and as Billie Jean flipped through an issue of Sports Illustrated, she complained, which is what she always did whenever she picked up an issue of SI. Read more…

The American Worth Ethic

Getty / Photo Illustration by Longreads

Bryce Covert | Longreads | April 2019 | 13 minutes (3,374 words)

“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.

The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes and overall wealth and in greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.

Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.

* * *

Trump has a long history of antipathy to the poor, a word which he uses as a synonym for “welfare,” which he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”

Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”

It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it made larger still by shrinking what was owed to the government in taxes. Donald and his siblings helped their parents dodge taxes with sham corporations, improper deductions, and undervalued assets, helping evade levies on gifts and inheritances.


Even the money that was made squarely owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.

Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.

That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In its lengthy tome, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which until the Trump administration existed only for direct cash assistance and food stamps, require a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork to prove those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are only deemed important for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country only expects the poorest to work in order to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.

* * *

The report doesn’t just represent an ethos in the administration; it was also a justification for concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order directing federal agencies to review public assistance programs to see whether they could impose work requirements unilaterally to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”

The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.

These aren’t fiscal necessities — they’re crackdowns on the poor, justified by the idea that the poor should prove themselves worthy of the benefits that help them survive, an idea that is not just cruel but out of step with real life. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.

Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.

Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.

If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries by percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like a lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.

We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?


Plenty of other countries don’t do so. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; instead, it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that they also have a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.

Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.

* * *

The CEA’s report on work requirements claimed that being an adult who doesn’t work is particularly prevalent among “those living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country averages just $826 each in income from capital investments; for those in the top 1 percent, the average is more than $16 million.

The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.


In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.

While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings could qualify for the new discount. If someone is passively involved, however — a shareholder who doesn’t take part in the day-to-day work of the company — then he gets 100 percent of the new benefit.

Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay it will be paying less when tax time comes due — $4.4 million less, to be exact.

Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.

* * *

Trump’s CEA report is accurate about at least one thing: Our country has a history of offering public benefits only to those poor who are either deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.

Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.

We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during that broader era that we also created or dramatically expanded significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.

Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.

The country is due for a reckoning with our obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we only concerned with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? It doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.

* * *

Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.

Editor: Michelle Weber
Fact checker: Ethan Chiel
Copy editor: Jacob Z. Gross   

When Zora and Langston Took a Road Trip

Library of Congress / Corbis Historical / Getty, Michael Ochs Archives / Getty

Yuval Taylor | An excerpt from Zora and Langston: A Story of Friendship and Betrayal | W. W. Norton & Company | March 2019 | 30 minutes (8,692 words)

 

Ornate and imposing, the century-old Gulf, Mobile and Ohio Passenger Terminal in downtown Mobile, Alabama, resembles a cross between a Venetian palace and a Spanish mission. Here, on St. Joseph Street, on July 23, 1927, one of the more fortuitous meetings in American literary history occurred, a chance incident that would seal the friendship of two of its most influential writers. “No sooner had I got off the train” from New Orleans, Langston wrote in The Big Sea, “than I ran into Zora Neale Hurston, walking intently down the main street. I didn’t know she was in the South [actually, he did, having received a letter from her in March, but he had no idea she was in Alabama], and she didn’t know I was either, so we were very glad to see each other.”

Zora was in town to interview Cudjo Lewis, purportedly the only person still living who had been born in Africa and enslaved in the United States. She then planned to drive back to New York, doing folklore research along the way. In late 1926, Franz Boas had recommended her to Carter Woodson, whose Association for the Study of Negro Life and History, together with Elsie Clews Parsons of the American Folklore Society, had decided to bankroll her to the tune of $1,400. With these funds, Zora had been gathering folklore in Florida all spring and summer. She was the first Southern black scholar to do this, and her project was, even at this early stage, clearly of immense importance. It had, however, been frustrating. “I knew where the material was, all right,” she would later write. “But I went about asking, in carefully accented Barnardese, ‘Pardon me, but do you know any folk-tales or folk-songs?’ The men and women who had whole treasuries of material just seeping through their pores, looked at me and shook their heads. No, they had never heard of anything like that around there. Maybe it was over in the next county. Why didn’t I try over there?”

Langston, meanwhile, had been touring the South for months, penniless as usual, making some public appearances and doing his own research. He read his poems at commencement for Nashville’s Fisk University in June; he visited refugees from the Mississippi flood in Baton Rouge; he strolled the streets alone in New Orleans, ducking into voodoo shops; he took a United Fruit boat to Havana and back; and his next stop was to be the Tuskegee Institute in Alabama. It was his very first visit to the South.

When Zora invited him to join her expedition in her little old Nash coupe, nicknamed “Sassy Susie,” Langston happily accepted. (The car looked a lot like a Model T Ford, and could only seat two.) Langston adored the company of entertainers, and Zora was as entertaining as they came. Langston did not know how to drive, but Zora loved driving and didn’t mind a whit. They decided to make a real trip of it, “stopping on the way to pick up folk-songs, conjur [sic], and big old lies,” as Langston wrote. “Blind guitar players, conjur men, and former slaves were her quarry, small town jooks and plantation churches, her haunts. I knew it would be fun traveling with her. It was.” Read more…

Twitter Won’t Miss You: A Digital Detox Reading List (and Roadmap)

Follow the crowds to a world with less screen time. (Photo by davity dave via Flickr, CC BY-SA 2.0)

Sara Benincasa is a quadruple threat: she writes, she acts, she’s funny, and she has truly exceptional hair. She also reads, a lot, and joins us to share some of her favorite stories. 

Have you ever needed a break, but just not known from what? Everything seems fine…ish. Your job is OK, your friendships are all right, your health is decent, nothing dramatic to report. And yet, you’re stressed. Dissatisfied. Bored. Sometimes you even feel exhausted and overwhelmed. Maybe you should distract yourself by looking at Instagram. Maybe you should find someone with whom to argue on Twitter. Maybe you should see what your ex is up to on Snapchat.

Or maybe you should get the hell off social media for a while.

At least, that’s the prescription issued by an increasingly vocal crowd of psychiatrists, psychologists, sociologists, writers, philosophers, performers, and general opinion-havers. The common term is “digital detox,” whereby an individual commits to a cessation of specific actions on one’s internet-enabled devices for a finite period of time. One can go on this adventure with friends, family, or a likeminded group of strangers from, you guessed it, the internet.

I’ve been an enthusiastic and sometimes addicted social media user since approximately 2003. But after beginning my research for this column, I went on a digital detox of my own. It is small and manageable, and nothing so impressive as author Cal Newport’s suggested 30-day detox from all nonessential online functions. But it has improved my life already in measurable ways. Here are some writers whose approaches to their own vacations from the Matrix helped me shape mine.

1. “Unplugged: What I Learned By Logging Off and Reading 12 Books in a Week.” (Lois Beckett, The Guardian, December 2018)

Beckett nabbed what must’ve been the plum journalistic gig of the year: head to a tiny cabin in the foothills of the Sierra Nevada, and read. Books. Made of paper. “This was a perfect assignment,” she writes. “For journalists on many beats — including mine, which includes the far right and gun policy — it had been a year of escalating violence during which conspiracy theories had moved into the mainstream.” And off she went, blissfully unencumbered by wifi. She brought a stack of critically acclaimed books purchased at different independent bookshops, and her plan was to read 30 books in a week, a number that sounds patently insane to me. She read 12. I’m still impressed — and envious.

The ensuing story is littered with gentle shade, which I always appreciate, and she’s a damn good writer: “I was not going to finish all 30 books at any cost, skimming to the right section of the right chapter in order to say one smart thing — in the U.S., we call this skill a ‘liberal arts education’ — but instead wanted the books’ authors and their protagonists to collide and argue with each other, to give me some different understanding of what had happened in 2018.”

2. “#Unplug: Baratunde Thurston Left The Internet For 25 Days, And You Should, Too.” (Baratunde R. Thurston, Fast Company, June 2013)

I adore my longtime friend Baratunde, though perhaps not as much as my mother, who has met the man twice and still has a copy of his 2013 Fast Company cover story somewhere in her house. He’s a great human.

And now that we’ve established my utter lack of objectivity, let’s hear from his 2013 self: “I’m an author, consultant, speechifier, and cross-platform opiner on the digital life. My friends say I’m the most connected man in the world. And in 2012, I lived like a man running for president of the United States, planet Earth, and the Internet all at once.” That very accurate description is exactly why it was so interesting that Baratunde Rafiq Thurston, of all freaking people, did a digital detox.

At the time, I remember worrying that he might burn out or possibly just suddenly up and die due to lack of sleep, so it was clearly a good move. I can’t imagine replicating what he did (no email?!), but since he was self-employed with a personal assistant and has an incredible amount of willpower, he was able to pull it off. His nine-point digital detox preparation checklist is incredibly helpful, and I intend to use it the next time I do one. My favorite line? “She transmitted this data by writing down the names on a piece of paper.” And yes, he was happier and healthier by the end of the experience. To this day, he goes on regular social media vacations, and I believe he’d tell you his life is better for it.

3. “Quit Social Media. Your Career May Depend On It.” (Cal Newport, New York Times, November 2016)

“I’m a millennial computer scientist who also writes books and runs a blog,” Newport writes. “Demographically speaking I should be a heavy social media user, but that is not the case. I’ve never had a social media account.” Newport lays out in plain, accessible language the notion that social media distracts from good work because it is designed to be addictive. It’s a notion with which I agree, based in no small part on my own lived experience; I have no doubt my writing output has suffered as I’ve devoted more and more time to social media. As Newport writes, “It diverts your time and attention away from producing work that matters and toward convincing the world that you matter.”

4. “Cal Newport on Why We’ll Look Back at Our Smartphones Like Cigarettes.” (Clay Skipper, GQ, January 2019)

Fast forward two and a half years. Newport, by now an in-demand speaker and author of two books — the latest is Digital Minimalism: Choosing a Focused Life in a Noisy World — expands on his November 2016 Op-Ed. Newport is a reluctant self-help guru who would undoubtedly reject that label. In this interview (as in the one I heard with him on fellow PoB (Pal of Baratunde) Lewis Howes’s podcast “The School of Greatness”), Newport stresses that he doesn’t typically offer a program or prescription. However, his recommendation for a 30-day digital detox seems simple in concept and necessarily jarring to execute: one dispenses with all digital products that are unnecessary to one’s career and personal health. Check your work email and log into your bank app to ensure a direct deposit has gone through, but let Facebook, Twitter, and Instagram accounts lie fallow for 30 days. Skipper is an able interviewer and Newport is a clear, experienced, and intelligent interviewee.

5. “I Quit Social Media for 65 Weeks. This Is What I Learned.” (Kareem Yasin, Healthline, February 2018)

Yasin interviews David Mohammadi, who left social media for over a year and loved the experience. A newly minted New Yorker, he abandoned the online pseudo-friendship industrial complex because he was worried he’d obsess over what was happening back in San Francisco. And he had good reason to suspect he’d be homesick — he’d tried the East Coast thing once, been endlessly captivated by his Bay Area friends’ Facebook updates, and ended up moving back to San Francisco. Years later, a more mature Mohammadi quit his job and decided to start a new career in New York with a clear mind unclouded by social media-induced FOMO. You likely won’t be surprised to hear his take: “The first week was hard. The second week was nice. And as I got closer to the end date, I just was like: ‘Wow. It feels great to be so present, and not just on my phone.’” But the benefits didn’t just extend to mental health — he made more money, too! Yasin writes, “Working as a boutique manager, [Mohammadi] noticed how his coworkers would constantly check their phones. Those two-minute breaks from the real world robbed them of opportunities to get more commissions — opportunities that would be theirs if they would just look up and notice the customers.”

* * *

Like you, probably, I have a personal Instagram account. Except it isn’t personal, really — with 14,200 followers, it is ostensibly a way to cultivate and grow an online brand based on me, myself, and I. I write essays and books; I do comedy shows; I lecture on mental health awareness at colleges; I pop up as a talking head in various capacities in various venues. Like you, probably, I want to be seen as an attractive person, so sometimes I use filters or put on more makeup than is absolutely necessary for a selfie. Like you, probably, I want to be seen as a capable person worthy of being hired, so I do my best to seem witty and fun but chill, man. Given that I want to write more for television and that a lot of my work falls under the category of “entertainment,” I have followed the conventional thinking in my industry, which boils down to “Always be selling (yourself).”

This thinking extends to my “personal” Twitter account (77,400 followers), despite my many qualms about the ethics of its overseers with regard to threats and harassment. It extended to my Facebook fan page, until I quit Facebook altogether because I don’t care what my least-favorite racist relative ate for breakfast — if I want to know what’s up with a boring person from high school, I’ll make private inquiries. When the current Russian government really loves something, I have to ask myself if I need that something in my life. (Note: I am aware that Facebook owns Instagram, and that I’m a hypocrite sometimes.)

Then there’s the Instagram account for my podcast (679 followers) and the Twitter account for my podcast (457 followers) and the Instagram account for my progressive lady-coat art project (26,200 followers). I don’t use Snapchat, because once I joined for 24 hours and my drunk friend sent me a dick pic framed by monogrammed his-and-hers towels in the master bathroom he shares with his girlfriend; I’m a Scorpio, and pseudoscience and common sense immediately told me the power of the Snap was too great for my personal constitution to handle. I also recently joined a few dating apps. And that led to more swiping, more clicking, more texting, more aggravation of writing-induced carpal tunnel issues. When an ex-NFL star asked me on what I’m sure would have been a super safe and not-gross date to his house at 3 a.m., I decided that Tinder was also too much for me.

At this point, and considering my sore wrists, the signals seemed to say, “SARA. TAKE SOME TIME OFF THE SOCIAL MEDIA.” I had 104,000 followers across social media, some of whom were double or triple followers and some of whom were robots, and while I loved each of them like my very own imaginary baby, Mommy needed a vacay.

First, I enabled the Screen Time function on my phone and discovered that I use it, on average, over seven hours a day. This horrifying fact led me to design the parameters of my moderate digital detox: I’d continue to use my email for work and social reasons. I would continue to use Twitter, but only to share my work or the work of a friend or charity. I would post a note announcing that I was taking an Instagram break until April 9, the day the second season of my podcast debuts, both to give a heads up to any former professional athletes that I wouldn’t be interacting with them there and to announce the premiere date. I would text when I felt like it, but leave my phone facing down when I wasn’t using it. I would remove Instagram from my phone, just as I’d done with Twitter months prior. At night and during my daily meditation practice, I would put the phone on airplane mode.

Following those simple rules, and only occasionally breaking them, I managed to reduce my phone time by 10 percent in the first week. I resumed the regular at-home yoga practice I’d attempted a month prior. I finished the outline of an hour-long TV drama pilot. I went on actual face-to-face dates with humans during daylight and appropriate evening hours. I visited with two friends. I got the “annual” physical I’d put off for two years. And I wrote this column.

While I intend to resume using Instagram on April 9, I will do as Cal Newport recommends: use social media like a professional, for specific purposes, and do not stray from said purposes. Twitter and Instagram will remain places for me to share my work and the work of friends and charities I admire. Sometimes, I will use these places to discover great writing, music, and more. Moving forward, I want to reduce my screen time by 10 percent each week until I average under four hours per day on my phone — and then I’ll try to reduce it even more.

I’m pleased with my progress. It may seem meager, but it’s a start. And I feel better already. So if you’ve considered quitting social media but have some qualms, do what I did: start small. Pop your head above the churning surface of our wild, untrammeled internet, and take a look around. Stay awhile. Your eyes will grow accustomed to real sunlight soon enough, and it’ll be easier to breathe. It’s pretty nice up here.

* * *

Sara Benincasa is a stand-up comedian, actress, college speaker on mental health awareness, and the author of Real Artists Have Day Jobs, DC Trip, Great, and Agorafabulous!: Dispatches From My Bedroom. She also wrote a very silly joke book called Tim Kaine Is Your Nice Dad. Recent roles include “Corporate” on Comedy Central, “Bill Nye Saves The World” on Netflix, “The Jim Gaffigan Show” on TV Land, and the critically acclaimed short film “The Focus Group,” which she also wrote. She also hosts the podcast “Where Ya From?”

Editor: Michelle Weber

Health Care Sponcon: Where Big Pharma Meets Instagram Influencer

Photo via Pexels

I’ve been reading about Instagram influencers of all flavors recently, from kid stars to travel bloggers. Enter the latest type of influencer marketing: health care sponcon. That’s right: pharmaceutical companies and Silicon Valley health startups are teaming up with social influencers to sell new drugs and medical devices.

“There is no doubt that this type of health care advertising-cum-storytelling is effective, and is frequently compliant with federal regulations,” writes Suzanne Zuppello. But is it ethical? For Vox’s The Goods, Zuppello digs into influencer pharma marketing and investigates how the FDA and FTC are attempting to regulate this type of sponsored content.

Lesley Murphy, a former contestant on The Bachelor and current travel blogger, uses her platform to disseminate information that benefits people like her who are affected by a BRCA genetic mutation, which increases a person’s risk of breast, ovarian, and pancreatic cancers. Murphy, who did not respond to requests for comment, documented her experience of undergoing a preventive double mastectomy on Instagram. Now she advertises ReSensation, a surgical technique launched in October 2018 that may help women undergoing breast reconstruction to retain some or all sensation in their breasts, to her 422K followers. Although ads for most surgical procedures are under the FTC’s purview, ReSensation’s use of human nerves also gives the FDA jurisdiction over Murphy’s Instagram and blog posts.

When asked how the influencer program was developed, Annette Ruzicka, a spokesperson for AxoGen, the company that developed ReSensation, said, “The only request of contributors was to write openly about their breast reconstruction process, and to also share factual information with their followers about the ReSensation technique. We shared publicly available information about the ReSensation technique to ensure that all content shared with the public was accurate. We provided no other content requirements for contributors.”

Murphy, who is not the only ReSensation influencer, has not undergone the procedure herself. But her followers may not realize this detail until they reach the end of her Instagram caption, where she directs readers to a blog post where, at the very end, she discloses her personal inexperience with the technique. Though this does not violate federal guidelines, nor those put forth by AxoGen, it does speak to the ethical obligation an influencer has to their followers.

The reality star’s Instagram post about the technique received almost 11,500 likes, giving ReSensation considerable exposure, yet Murphy omits disclosures required by both the FTC and FDA. She uses the term #partner to disclose that she is a compensated influencer, but the term is considered too vague, even for the FTC, for a user to clearly understand the relationship. She also fails to offer any information about the technique, disregarding federal guidelines to disclose risks and benefits that may impact patient decision-making. Instead, she directs followers to her blog where she discusses “a new technique designed to restore sensation in breasts after surgery,” lamenting the numbness in her breasts since her mastectomy and reconstruction.

Her blog post is where we finally learn the technique was not used on Murphy and cannot be used in conjunction with implant reconstruction, the most common and least complicated form of breast reconstruction, and the type of reconstruction Murphy underwent. Neither Murphy’s posts nor the ReSensation website discloses the success rate of the technique, instead focusing on an insecurity that has plagued mastectomy patients for decades: numb breasts.

Read the story

Labor Pains: A Reading List

A doctor examines a pregnant woman in Allahabad, India, 2011. (AP Photo/Rajesh Kumar Singh, File)

Sara Benincasa is a quadruple threat: she writes, she acts, she’s funny, and she has truly exceptional hair. She also reads, a lot, and joins us to share some of her favorite stories. 

Prior to researching this column, I felt no significant babymaking desire tugging at my uterus. This is not to say I have not thought of being a mother or a stepmother. Adoption and foster-to-adopt programs have always held a special fascination for me, even when I was a little kid. But the biological mechanics of what happens at the end of the human assembly line — you know, the manner in which the finished product exits the factory door? That always freaked me out.

According to my mother, Child Me reacted to the discussion of labor and delivery with disinterest at best and revulsion at worst. Mom worried that she’d somehow made me afraid of it. In fact, she had not; she’d always spoken of pregnancy as the happiest time of her young life, and had two relatively swift and uncomplicated deliveries with healthy babies. When she was 24, I woke her up at 1:00 a.m. one October morning and was out in the world by a quarter past four, taking the traditional route. When she was 27, my brother took maybe six or seven hours on a Sunday in early December. She said he “shot out like a football.” I never knew how to react to that, and I still don’t.

As a child, I asked her how painful it was. She said, “Kind of like… having to do number two in a really big way.” She has since admitted this was an understatement, though one often does go number two during a vaginal delivery, but says “it wasn’t that bad” and “at the end you get a beautiful baby!”

My mother accepted long ago that making babies was not high on my priority list. She always encouraged my career and creative aspirations. I give her a lot of credit for not pressuring me about it like some women’s mothers do. I’ve told her that I just don’t have baby fever.

But then I researched this column.

And now…

Well, aside from abstinence from sexual intercourse, there is no greater method of birth control than reading birth stories. Add articles about labor and delivery as managed by the medical industry in the United States, and you’ve got a cocktail that should be nearly as effective as the common oral contraceptive.

My hat is off to women who go through with having a baby — and especially those who choose to do it again. That’s wild, lady! But as you’ll see from the stories I’ve collected below, some labor and delivery experiences are less than ideal, to say the very least. I’m glad real women share what really happens to them rather than glossing it over with some fairy tale bullshit. More real stories from real women who don’t pretend everything is easy, please. And more reporting on the way Black women and poor immigrant women are consistently offered a lower standard of maternal healthcare.

1. “I Think, Therefore I Am Getting The Goddamned Epidural” (Rebecca Schuman, Longreads, November 2017)

I despise every hippie braggart Schuman cites from Ina May Gaskin’s creepy-sounding books Spiritual Midwifery and Ina May’s Guide to Childbirth. At one point I also wanted to lightly smack her husband and kick the shit out of her anesthesiologist, though probably not as much as she did.

Dads make mistakes. It is a fact that my dad is awesome and also that while I was being born, he walked into the wrong labor and delivery room, misreading the name on the door. He did not recognize the gaping vagina before him and swiftly made his exit. During my mother’s second delivery experience, with my younger brother, he pissed her the fuck off by a.) complaining about the room temperature and opening the window when she was fucking cold and b.) bringing in a TV so he and the doctor and any orderlies could watch the game. But he turned out to be a splendid dad.

(As for a similar redemption for Schuman’s shitty, bored, Instagram-scrolling anesthesiologist, I have less hope. I’ve always regarded anesthesiologists as the groovy magicians of surgery — they show up, make your life better — or worse, if they want! — and then disappear. This gal seems to have gone to the wrong wizarding school.)

Schuman, who is one smart cookie, talks about Descartes in an accessible way and connects him quite easily to birthing:

“But what then am I?” he asked. “A thing which thinks. What is a thing which thinks? It is a thing which doubts, understands, [conceives], affirms, denies, wills, refuses, which also imagines and feels.” These might not seem to be questions (or answers) that one naturally associates with the act of giving birth, but perhaps they should be. The midwives in my books were asking versions of these questions, after all, and they shouldn’t be the only ones who got to. Indeed, what makes all that mother-Goddess-yoni-orgasm stuff disquieting is not actually its medical dubiousness. It’s the decidedly un-philosophical certainty of the operation.

If I still drank, I would toss back some bourbon with Schuman (though not if either of us were pregnant, obviously). Regardless, I would like to buy her a beverage or a large carbohydrate-based baked substance one day.

2. “The Lavender Room” (Cheryl Strayed, Slate, April 2014)

Cheryl Strayed had an ideal situation: the desire for a baby, good health, access to excellent care. Then she labored for 43 hours and pushed an 11-pound kid out of her undercarriage. I have no words other than “holy shit, what a warrior.” She is very encouraging of other women having their baby the way they want, which makes this a very sweet and loving story. When she mentions laboring while asking her deceased mother to help her, I got teary-eyed.

It also reminded me of how long labor can take. My sister-in-law and younger brother texted me a few hours after her water broke on a Sunday afternoon. I felt sure the baby would be there by the time I arrived in New Jersey on a flight from Los Angeles the next afternoon. Nope! I visited the hospital room, drank margaritas at the Stuff Yer Face in New Brunswick, New Jersey with the other aunties and an uncle, and got a full night’s sleep before I finally woke up to the news that a child was born unto us. Now we are all obsessed with him and his favorite song is “Psycho Killer” by the Talking Heads. He is 17 months old and looks like Wallace Shawn.

3. “I’ve Given Birth 4 Very Different Ways – Here’s What I’ve Learned” (Laurie Batzel, PopSugar, June 2018)

I think I love this woman. She curses way less than I do but she does not pull punches.

I’m a former ballet dancer and have performed in blood-soaked pointe shoes through severe sprains and other sundry injuries. My pain tolerance is not insignificant. But there is no pain on earth like having a baby. When the nurse told me it was too late for an epidural, I would have sobbed if I’d had the strength. I had marched around the labor and delivery unit for three hours straight to avoid Dr. Jerk, I hadn’t slept in over 36 hours, and, as badly as I wanted the “traditional” birthing experience, I would have performed my own C-section right then and there to make the pain stop. Seriously, it’s a good thing there were no spare scalpels, letter openers, or jagged shoelace tips lying around, because I would have gone rogue in a heartbeat.

She had two C-sections followed by two VBACs (vaginal birth after Cesarean). She also says that if a guy tries to convince you that passing a kidney stone is as painful as giving birth with no drugs, you can punch him “in the biscuits.” Starry eyes over here! She concludes with the very kind sentiment “there’s no wrong way to become a mother.” What a refreshing antidote to some of the “you must have a vaginal birth with no drugs so that you can be a true woman” bullshit I read while looking through articles.

4. “Lost Mothers” (ProPublica, 2017-2018)

In publishing, any subject can become a trend, a flash in the pan, a momentary topic of national chatter. Sparked in no small part by Serena Williams talking to Vogue about nearly dying after the birth of her daughter, 2018 saw more mainstream publications begin to cover the topic of maternal mortality among Black women. But organizations like ProPublica, NPR, and smaller independent publications had addressed the issue previously, and Black women themselves had been speaking up about it for years.

It is incumbent upon reporters at mainstream publications to continue to report on this humiliating and devastating national health crisis. In the meantime, ProPublica did the legwork with a series of articles about the many, many Black women who experience a ghastly standard of maternal healthcare in the United States.

5. “I Was Pregnant and in Crisis. All the Doctors and Nurses Saw Was an Incompetent Black Woman” (Dr. Tressie McMillan Cottom, Time, January 2019)

This story is vivid and it is horrifying and it is heartbreaking. Read every word of it. Here are a few: “When the medical profession systematically denies the existence of Black women’s pain, underdiagnoses our pain, refuses to alleviate or treat our pain, healthcare marks us as incompetent bureaucratic subjects. Then it serves us accordingly.”

6. “Why does it cost $32,093 just to give birth in America?” (Jessica Glenza, The Guardian, January 2018)

These statistics are stark. Writes Glenza:

Despite these high costs, the US consistently ranks poorly in health outcomes for mothers and infants. The US rate of infant mortality is 6.1 for every 1,000 live births, higher than Slovakia and Hungary, and nearly three times the rate of Japan and Finland. The US also has the worst rate of maternal mortality in the developed world. That means America is simultaneously the most expensive and one of the riskiest industrialized nations in which to have children.

So we’re paying the most in the developed world for the shittiest treatment in the developed world? Okay, makes sense. No wonder so many women reject the conventional medical approach to birth and buy into comforting “orgasmic birth is possible, babies just slip right out, pain is all in your mind and was put there by The Man, also buy my book and taint moisturizer” pseudoscience, rocketing from one extreme to the other.

As with anything else, it seems, a complementary medical approach is best, blending conventional medicine with alternative or “traditional” healing techniques. But while my complementary medical idea sounds delightful if you can afford to pay out of pocket, how many health insurance plans will pay for your midwife, doula, obstetrician, nurses, and a stay of a night or more at some swanky, soothingly lit spa retreat? Oy vey, what a mess.

* * *

The other ways to obtain a beautiful baby without almost certainly going number two in the process have always seemed the more palatable options to me. Of course, the headaches and heartbreaks possible with adoption and foster-to-adopt are innumerable. Taking on the huge responsibility of parenting does not seem simple — nor should it, I suppose. Plenty of abusive, nasty jerks have kids, and I rather wish they’d give up for fear of poop on the delivery table or too many forms at the agency.

I may yet become a mother. I don’t know. At present, I am glad to be an aunt; I am glad to entertain my friends when they have kids, or to entertain the kids so that my friends can use the toilet in peace or take a nap. I feel enormous gratitude that generations of American women have fought to ensure that women of childbearing age have rights and protections that were unthinkable years ago — as well as the right to prevent or terminate a pregnancy.

I feel energized to work harder to ensure better access to healthcare for all women, and to help make certain motherhood remains a choice. I should say “biological reproduction” because, as Batzel wrote, “There’s no wrong way to become a mother.” And of course I know — and you now know I know — it is fine to choose to go without children. You’ll sleep more and save money, much of which you can spend spoiling other people’s kids. I can’t recommend that enough.

* * *


Theatre of Wokeness

Illustration by Katie Kosma

Danielle A. Jackson | Longreads | January 2019 | 7 minutes (1,942 words)

There’s a certain kind of conversation everybody seems to be having right now. It takes place most often online, but sometimes in real life. Specifics vary, and its frequency and level of intensity ebb and flow with the news cycle. An awards show, a White House firing, a video of police misconduct, a local ballot initiative on medical marijuana — anything tangentially related to race or gender can be fodder. It starts out engaging enough. Then tensions mount; participants morph into archetypes. Its substance diminishes into the reduced, neutered language of the “moment” before disintegrating altogether.

In a would-be map of this phenomenon, the first Women’s March, held the day after President Trump’s inauguration, is an inflection point. On November 9, 2016, Teresa Shook, a white former attorney living in Hawaii, created a Facebook event for “a women’s march” that quickly drew several thousand RSVPs. Shook then enlisted a small group of women to help with early planning. Organizers were frightened the incoming administration would “threaten access to women’s healthcare, erode protection against sexual violence and roll back aid to struggling mothers.” Shook felt “shock and disbelief that this type of sentiment could win,” she told Reuters. “We had to let people know that is not who we are.” Yet, Trump’s victory wouldn’t have happened without heavy support from white women in the electorate. Terms like “intersectionality” entered the mass media’s lexicon to help explain the difficulty inherent in assembling women into a voting bloc. Along with the election’s results, the terms proliferated in a major way via Instagram, hashtags, and memes.

The march’s founders and early organizers soon appointed a diverse cadre of women to leadership, with assistance from activist and political connector Michael Skolnik. The organizers also made sure an anti-racism agenda was part of their framework. Pulled together in just a few short months, the March was a resounding success. The central protest, in Washington, drew an estimated half a million attendees (yielding more than a million rides on DC’s Metro, the second-busiest day in the system’s history, behind only the first inauguration of Barack Obama). Counting the well-attended “sister marches” held around the country, “1 percent to 1.6 percent of the U.S. population” participated in a demonstration, the Washington Post reported.

It isn’t an exaggeration to say that people who weren’t concerned about race and social justice before are now. According to a CNN/Kaiser poll, 49% of Americans said racism was “a big problem” in 2015, up from just over a quarter who said so in 2011. Gender inequality, too, seems top of mind: a Pew Research Center survey from 2018 found that about half of Americans think men getting away with sexual harassment or assault is “a major problem.”

Some say we’re living through “a moment,” that we’re “having a reckoning.” I have a hard time with those words — they’re soundbite-y, naïve, and incomplete, as if the “moment” is for people who hadn’t even had to think about inequality or dealt with it in any large or small way — being followed around a store, or subjected to different standards on a job, or denied an apartment for no obvious reason. And if that’s the case, how’s it different from any other moment? Does it hold up, withstand rigor, or is it a surface-level reckoning, concerned with optics and the appearance of social justice and equality?

The Women’s March’s leaders have had to answer such questions. Amid charges of administrative mismanagement as well as anti-Semitism, stemming from the organization’s alleged negligence toward Jewish women and its leaders’ interactions with the Nation of Islam and Louis Farrakhan, some leaders and sister groups have split off from the central organizing body. Last August, Black Women’s Blueprint, a Brooklyn-based organization focused on policy advocacy and grassroots organizing, wrote Women’s March, Inc. an open letter: “Rather than rubbing elbows and entreating known misogynist leaders… we charge you to meet us in the trenches.” Hastily organized and orchestrated in pursuit of an of-the-moment illusion of inclusion, or what I’ll call a “theatre of wokeness,” the Women’s March may be in danger of imploding. In November, the founder, Shook, called for all four co-chairs to step down, and over the past few weeks (leading up to the third march, taking place January 19), several former sponsors and partners walked away from the March, including the Southern Poverty Law Center, EMILY’s List, and the Democratic National Committee.

Along with institutional and personal reckonings, our “moment” has also birthed a category of creations and products that support, mirror, and mine it. Sitcom episodes, satirical bits, comedy specials, films, music, and other performance art across and in between genres and mediums have attempted to mimic and explore our confusion, our dinner table banter, the rhythm of our outrage cycle, our anxieties, awakenings, and incipient healing. It’s a prolific time. The results, for me, have been mixed; sometimes, in an attempt to titillate or provoke, characterization, interiority, or reflection gets lost or weighed down in favor of an appropriate level of wokeness. Other times, I’ve questioned the motives of the creators, wondering if staying current and in tune with the “moment” is what it’s all about after all. More than anything, I wonder what the whole point of the reckoning is. In our creative responses, are we, in some cases, reinscribing the same disappointments we’re trying to reconcile? Further, what comes after the problems get addressed? What happens if, when, and after a collective consciousness has been awakened?




* * *

I had these and other questions watching Slave Play, a three-act satire that ran until January 13 at the New York Theater Workshop (I also heard whispers that it could be headed for Broadway). Director Robert O’Hara and playwright Jeremy O. Harris — a student at Yale’s School of Drama, and one of New York Times Style Magazine’s Black male writers of our time — imagine a world that, once fully revealed, looks very much like our own. Yet, we don’t know that at first. We see, instead, three interracial couples engaged in “slave play,” or sexual acts meant to simulate the race, gender, and class dynamics of antebellum America. Disorienting details hint that something is askew: the slave woman twerking on the floor to Rihanna while cleaning; the mistress twitchily summoning a tall, light-skinned fiddler to her bedside; the Black overseer crying frustrated tears through pleasure as his white indentured partner licks his boots. It titillates, it makes us (some of us, mostly the white folks) laugh. It, thankfully, ends quickly, giving way to a modern-day scene that sends up a certain kind of east coast, academic, therapeutic language, the language of our “moment,” to hilarious effect. It turns out the three interracial couples are all in therapy because the Black partners can no longer feel sexual pleasure in their respective relationships. And true to real life, the white partners (or those with closest proximity to whiteness) are emotive, externalized, and sometimes vocally annoyed, while the Black partners, for much of the time, simmer, stunned and silent.

All the actors play to some level of humiliation, but the Black woman in the therapeutic experiment, Kaneisha, played with a convincing prickliness by Teyonah Parris, seems to get especially short shrift: face down, she eats a busted cantaloupe off the floor in the first act, and by the third act, exorcises some trauma when her formerly petulant partner agrees to call her a “nasty negress” while they’re having sex. “Thank you for listening,” she says after the wordplay turns into several minutes of vigorous fucking.

The ending is an unsettling, confusing affair. I wasn’t sure if a rape had taken place or if it was, instead, a “breakthrough” achieved through consent. At any rate, the labor of Parris, on whose character arc the entire show builds its human core, stayed heavy on my mind for days.

“I don’t want people to be able to walk away from a play about slavery and say, ‘Oh, well, that’s not about 2018,’” Slave Play’s playwright told an audience of donors, according to a Times profile. But who, exactly, doesn’t notice that the reverberations of slavery are still with us? If we’re really trying to wake up white people, I wish folks would say that. Slave Play’s Black cast members likely had to do heavier lifting — physically and psychically — than the white (or white-ish) cast members in reimagining scenes drawn from America’s slave past. Do these interventions even work? And if they do, at what cost — to the audiences who may be harmed? To the cast and crew?

* * *

The politics of pleasure are as ripe a place as any to dig — for creative play, for exploration and elucidation — mapped as they are into the subconscious, and there’s a legacy of their exploration in the work of Frantz Fanon and Adrienne Kennedy, both apparently influences on Slave Play’s playwright. The goal is to unsettle, to probe, and I can get with that, up to a point. What about context[1], interiority, reflection within the fictive universe of a piece? Maybe more of that would have been helpful in constructing Kaneisha as more than a spectacle. She speaks a lot, especially in the third act, but mostly her character is seen through the eyes of her partner, as she talks about herself in relation to him and other white people from her past.

Even a journalistic endeavor could be improved with an ethics of care. In the six-part docuseries “Surviving R. Kelly,” which aired January 3-5 on Lifetime (and is still available on demand), the drama of Kelly’s victims’ pain is the main event, drawn out for the benefit of the collective consciousness. I was well-acquainted with the story, yet still not entirely prepared for the grotesque details I saw and heard.

The series has already brought what feels like a shift: a lawyer for one of the families accusing Kelly confirmed that senior investigators from Fulton County, Georgia interviewed his client. The state’s attorney in Cook County, Illinois has asked for victims to reach out. There have also been costs: survivors featured in the documentary have been doxxed, discredited, and disparaged online. I saw it in my own feeds, from people in my own family. I’ve seen Black women, unaffiliated with Kelly, report that they’re “not ok” and that they’ve had difficulty sleeping after watching or talking about the series. In the series, some survivors were visibly traumatized during their interviews. (Watching Asante McGee revisit a room she recalled being held captive in reminded me of a question from In the Wake: “Where is the breaking point, the breath, the pause…?”) How, really, should you manage when confronted with the truth of just how vulnerable you are?[2] More context could help. The music industry has a history of sexually exploiting underage girls — critics Ann Powers and Nelson George explain this powerfully in the series — but so does, specifically, the tradition of Black music upon which Kelly built everything. He’s a hip-hop generation misogynist who learned from his peers and from soul music forebears like Marvin Gaye and Al Green and James Brown, all of whom have allegations from harmed women tainting their legacies. Black Gen X-ers didn’t handle R. Kelly before because their forebears didn’t handle their own.

In Feeling Backward: Loss and the Politics of Queer History, Heather Love writes, “For groups constituted by historical injury, the challenge is to engage with the past without being destroyed by it.” Audiences and creators ask a great deal of people when they’re digging into the past, probing around the depths of ancient and not-so-ancient traumas. If the moment requires that the confusion of the present and the pain of the past get served up with realistic viscerality — if it’s about more than being current, and more than just theatre — special care should be taken with the subject matter as well as the casts, sources, and audiences most likely to be impacted.  

* * *

[1] On January 14, 2019, Jonathan Square of the digital humanities project Fashioning the Self in Slavery and Freedom published a syllabus to help with processing Slave Play.

[2] Girls for Gender Equity and Black Women’s Blueprint produced and published reading guides and community toolkits for “Surviving R. Kelly.”

Take Script, Add Snow

Still from Christmas in Evergreen. Front Street Pictures / The Hallmark Channel / Getty Images / Composite by Katie Kosma

Jane Borden | Longreads | December 2018 | 12 minutes (3,211 words)

In a big city, a woman lives a fast-paced life until something forces her to visit a small town, just before Christmas. Shortly after arriving, she connects with a charming small-town man. Commence ice-skating, hot chocolate, a tree lot, tree decorating, caroling, gift giving, charity work, big family meals, snow, snowmen, snowballs, snowball fights, red scarves, cookie decorating, a grand old house or country inn, sleigh bells, giddy children, and the soft plucking of stringed instruments whenever a character delivers a joke.

Every made-for-TV Christmas movie tracks the above plot. And yet, the uniformity does not prevent proliferation: This year alone, Hallmark made 38 holiday films across its two channels. Lifetime made 14. “The word is insatiable,” says Meghan Hooper, senior vice president at Lifetime and Lifetime Movie Network. “We don’t seem to be able to do enough to make the audience happy.”

“Suddenly, Hallmark is no longer a guilty pleasure, it’s just a pleasure,” says writer-director Ron Oliver, who has made 13 Christmas movies for TV since 2004, mostly for Hallmark, which has become synonymous with heartfelt, holiday romantic comedies (it’s the Xerox of them, or, if you will, the Kleenex). “I have not seen this happen until this year. Everybody jumped on board.”

In 2017, 83 million people watched at least one Hallmark Christmas movie during their Countdown to Christmas and Miracle of Christmas events. The Hallmark Channel was last year’s number one cable network among women 25 to 54 in quarter four, and is shaping up to remain in that spot for 2018. Both Hallmark and Lifetime boast double-digit ratings increases during December. And the field is getting crowded: UPtv produced seven original holiday movies this year, Netflix made four, and Freeform made three.

How did America become obsessed with sappy, predictable, low-budget Christmas movies? Before we delve into history and psychology, let’s finish the plot:

Our protagonist falls in love, of course. She also rights an ethical wrong, always something from the past awaiting resolution and absolution. “It all goes back to Dickens,” Oliver says, referencing Scrooge’s transformation in A Christmas Carol. Like Scrooge, our modern-day heroine receives help, her three ghosts being her new love interest, the small-town community, and a stranger who slightly resembles Santa (what Hooper calls “a Santa-like”). At the film’s end, she and the love interest kiss, usually for the first time (these are family films), and she either moves to the small town or is forever changed by it. No one asks why the moms and aunts look only 10 years older than the protagonist. Roll credits, start the next one, begin to confuse which characters are in what.

It’s easy to assume that viewers enjoy these movies in spite of the repetitive plotlines, as if the networks greedily scam us. But Hallmark and Lifetime both do extensive focus grouping and ratings analysis. They know what works — we watch these movies because the plots are the same. In fact, Oliver calls the plots peripheral: “The real elements of these movies that make people love them is this sense of returning to your own past, your own childhood and sense of innocence from that era.” The setups and environs may vary slightly, but they must stay similar enough to deliver us there. These films are not art or even entertainment — they serve a function. They are ritual, a ritual as pagan as Christmas’s origins. And their key piece of iconography is the kind of American small town that’s quickly disappearing.

***

IMAGINARY AMERICA

“Small towns have always been the iconic image for human relationships,” says Krystine Batcho, a professor of psychology at Le Moyne College, whose work specializes in nostalgia. “The older woman lives here and the nice young family lives here. When you present the imagery of the small town, it’s portraying how we can get along in peaceful ways and help one another.”

Oliver grew up in one of these towns. He recalls, “It was this absolute Norman Rockwell Christmas town.” However, he adds, “I went back, maybe 20 years ago, and it is now strip malls. There is nothing to hold onto from there, so you hold the memories and recreate them in stories.”

He uses his hometown as a template when designing scenes and sets, but admits a challenge: “A few places exist, but they are getting harder and harder to find. We have to make them. We use every trick in the book.” Christmas Everlasting, one of this year’s Hallmark films, was partially shot in Covington, Georgia, because Oliver was drawn to its charming town square. “But when you go two blocks from there, it’s Walmart and CVS,” he says. Ultimately, the setting of the film is an amalgamation of three different small towns, plus a healthy dose of CGI.

Suddenly, Hallmark is no longer a guilty pleasure, it’s just a pleasure.

So these movies deliver a fantasy of a memory — except, for most of us, it’s a false memory we internalized through Norman Rockwell art, and Rockwell was also delivering the fantasy of a memory. You can trace the line of American Christmas imagery all the way to Queen Victoria: In 1848, The Illustrated London News published an etching of Victoria and Prince Albert standing around a Christmas tree with their family. It was published two years later in the States, and had a lasting impact, popularizing the tradition. “Christmas was not always a family-centered celebration,” explains Bruce Forbes, professor emeritus of Religious Studies at Morningside College, and author of Christmas: A Candid History. It became a family holiday in the second half of the 19th century, thanks to the wild popularity of Queen Victoria, and to Charles Dickens, who published A Christmas Carol in 1843. “Dickens was not telling you what was happening in England, he was trying to create a Christmas that didn’t exist yet,” says Forbes. “When we talk about the ‘spirit of Christmas’ now, we talk about generosity. That’s a Dickens creation.”

What did exist prior? In England, not much. As a result of the lasting effects of the Puritan revolution in the 1600s, the English hardly celebrated Christmas at all. The Puritans’ beef was twofold: Christmas was not celebrated by early Christians (the holiday didn’t appear until the 300s) and those who did celebrate Christmas, Forbes says, “went to midnight mass and then to the tavern and got drunk.” So Puritans wiped it from the English consciousness.

The Illustrated London News (1848)

Taking the baton from the Victorian image was Currier and Ives. The phenomenal success of this New York City lithography firm in the mid- and late-19th century put affordable prints of snowy landscapes into the hands of a nation. Then in the 20th century, Norman Rockwell paired nostalgia for 19th-century Christmas with images of our mid-century obsessions: the nuclear family and suburban life. It was a powder keg. Today, pulp-like TV Christmas movies recreate these images again — in their own way as prolifically as Currier and Ives — but this time they’re more reflective of our modern world. Additions include both the mundane (texting) and the imperative (finally, after years of criticism, we see characters of color). Nostalgia is meta by nature.

***

ART IMITATES ART

A few years ago, Hallmark Cards, Inc., the parent company of the two Hallmark networks (aka Crown Media), tapped one of its senior illustrators, Geoff Greenleaf, to create a series of images about a fictional town called Evergreen. The series of snowy scenes in a small town, featuring quaint shops and an iconic vintage red truck, became a bestseller. In response, Crown Media turned the cards into a 2017 film titled Christmas in Evergreen. The movie, and its 2018 sequel, Christmas in Evergreen: Letters to Santa, were shot at Burnaby Village Museum in British Columbia, itself a fictional setting designed to preserve and romanticize small towns of yore.

Dickens was not telling you what was happening in England, he was trying to create a Christmas that didn’t exist yet.

Oliver believes the mid-century American imagery that these films capitalize on is so effective because it speaks to a time “when America was truly powerful and firing with all six cylinders: making great cars and great music, going to the moon, for crying out loud.” But, of course, Rockwell and his ilk rarely painted the whole picture. “It’s sanitized,” says Forbes. “It ignores all kinds of things: race, teenage pregnancy, poverty. But it was the image that white Americans had of themselves.”

TV Christmas movies have certainly perpetuated this brand of whitewashed nostalgia. As the films’ popularity rose, networks received ample criticism. Still, just five of Hallmark’s 38 holiday films this year feature leads of color. This includes Christmas Everlasting, starring Tatyana Ali, who also stars in this year’s Jingle Belle on Lifetime. “It speaks to a shift in our culture, that suddenly there is a move afoot to have more and more of our real world look like our television world,” says Oliver. Still missing in the genre are LGBTQ love stories; no lead to date has been gay. 

As American as the proclivity toward heteronormative whitewashing is the tendency toward consumerism — some of the imagery within the films is for sale. Christmas in Evergreen has an adjacent product line: a keepsake ornament of the red truck, a magic snow globe, a mystery key, Santa’s mailbox. “We worked in partnership to look at a couple of the products they had that we could weave into the overall storyline,” says Michelle Vicary, executive vice president of programming and network publicity for Crown Media Family Networks, which owns the Hallmark Channel and Hallmark Movies & Mysteries. And of course they did: Hallmark was a retailer long before it got into content creation. “Not all brands evoke emotional connections,” Vicary says, “but that is at the top of what this brand promises and it has been since the beginning.”

***

CAPTIVE ON THE CAROUSEL OF TIME

These films are not merely delivering the past; the stories achieve a delicate balance between familiarity and novelty. In them, we see a “constant struggle between wanting to hold onto certain things from the past but wanting a new beginning,” says Batcho. That new beginning is provided primarily by the love interest — TV Christmas movies are never not romantic comedies. Hooper says her team at Lifetime learned the importance of adding a romance element, after trying other versions of holiday films without it.

This makes sense to Batcho. “Like nostalgia, romantic stories focus on relationships and the sense of the ideal,” she says, adding that early brain imaging studies suggest that romance and nostalgia produce similar hormonal releases in the brain, “the loving, feel-good, pro-social feelings.” Plus, of course, as important as it is for us to protect the village, we need new relationships to strengthen the gene pool. But Batcho also says novelty is intrinsically linked to nostalgia. She likens cyclical markings of time, such as holiday celebrations, to a carousel: Each time it goes around, the horses look the same, but different people may be sitting in different places. “By being different and new, [novelty] allows you to escape so you’re not trapped in the past,” Batcho says. “Nostalgic people tend to be more optimistic, forward-looking people.”




Although nostalgia was originally defined as homesickness and categorized as a disease, scientists now understand its healthy and profound psychological function. Studies suggest that exercising nostalgia enhances mood, reduces stress, and increases both social connectedness and self-regard. “Nostalgia helps us rediscover aspects of our authentic self, by going into our past,” says Batcho. Studies even suggest it helps us ascertain a meaning or purpose to our lives. Batcho also describes nostalgia as a social experience, since we identify ourselves in terms of relationships. “Nostalgia actually helps diminish loneliness by reminding people that, even if you are not physically with those who have loved you, you were once loved,” she says. Further, the process isn’t random. We specifically seek past memories that will help our current state.

“Every time you turn the television on,” Oliver says, “you’re seeing news about another betrayal of an ideal that America held close for a long time. I think the country is trying to find its moral center again. There’s a consistency to these stories that people hold onto like a life raft in the middle of a cultural storm.” Perhaps, as viewers, we also try to right a wrong.

The makers of TV Christmas movies wisely trigger nostalgia in several ways. “We certainly have gone after a certain type of talent, in terms of recognizable faces we grew up with from ’80s and ’90s shows,” says Lifetime exec Hooper. Nostalgia serves us. A 2018 survey by Cigna reports that most adults are lonely — as in, the average score, on a scale from one to lonely, was at least lonely. Other studies have shown loneliness to be a major predictor of poor physical health, leading some researchers to declare loneliness both a health crisis and an epidemic.

Further, cohorts aged 18 to 22 and 23 to 37 reported more loneliness than older generations. “That’s new,” Batcho says. A press representative for Hallmark identifies the network’s demographic as women 25 to 54, but says that during the fourth quarter — our holiday season — “our women and adults 18 to 34 are through the roof,” suggesting a potential link between loneliness and viewership.

***

WE ARE ALL JUST PAGANS BY A FIRE

Nostalgia TV has been booming for a few years now, and the seemingly endless reboots and remakes premiere all year long. So why this huge surge in viewers around Christmas? Even otherwise prestige-TV-obsessed viewers, who turn up their noses at predictable schmaltz, now indulge in made-for-TV Christmas movies. “I don’t know if the season causes it as much as the season gives you permission,” opines Oliver. “From Thanksgiving night onward, you are allowed to be sentimental.”

“It’s something about the holidays that is just built in: indulgence. Drink the hot chocolate, eat the food, lay on the couch, enjoy your family,” says Hooper.

And when Batcho is asked why people insatiably consume this kind of content during the holidays, she says, “Winter represents nature dying and taking a pause. It makes you feel very sad and hoping to look forward to a rebirth in the spring, which tells us that it is very fundamental and natural for people to like cycles. Bears hibernate. Even human beings need to take a pause.”

In one way or another, they are all saying the same thing, which is that we watch Hallmark around Christmas for the same reason Christmas happens at Christmas: the solstice. In the 300s, when the church designated the holiday, it likely chose December 25th for a litany of savvy reasons: some political, some for convenience, and some building off already established pagan rituals. “You could guess them even if you didn’t study the cultures,” says Forbes. “If it’s a midwinter festival, it would be a festival of lights to push back the darkness. It would feature evergreens because they look alive when everything else has died. To get past the isolation of winter, you would have feasts. And you would have dancing, singing, and drinking.”

From Thanksgiving night onward, you are allowed to be sentimental.

Part of why we’ve celebrated midwinter festivals since before recorded time is because, as Batcho said, we like cycles. They help us predict regular change: It’s cold and dark now, but abundant spring will come again, and we know it. They also help us deal with the constancy of change, with whatever on the carousel is new. “We can’t stop change. What do we do instead? Build cycles,” Batcho says. These cycles come in the form of temporal landmarks, which trigger nostalgia: birthdays, anniversaries, holidays … holiday movies. “We anchor ourselves. It is important for psychological well-being to have a sense that we are not out of control.”

The consistency of plot and its predictable ending therefore serve an important purpose: we need the films to be predictable because they are another icon of the midwinter festival. We see one and our brains not only know what to expect, but also what to do. If we seek this iconography now more than ever, then we must feel especially out of control.

***

THE FAST AND THE FURIOUS

It would be easy to attribute the popularity of these films to a kind of escapism resulting from the current division in our country, from the fear and hatred that many feel on both sides. But the rise of both Lifetime and Hallmark TV holiday movies started ramping up around 2012 (see graph). Perhaps the division in our country and the popularity of holiday films (and nostalgia programming in general) are effects of the same cause: an almost unfathomable acceleration of rates of cultural change.

In the 1980s, architect and inventor Buckminster Fuller (he of Dymaxion House and Geodesic Dome fame) posited a theory known as the knowledge-doubling curve. This sort of stuff isn’t 100 percent measurable, but the basic idea is: The amount of information we know, as a species, doubled about every 1,500 years back when we were cavemen, and every 100 years in the modern era, up until World War I, at which point it started doubling at an ever-increasing rate. In the ‘90s, artificial intelligence researchers estimated the amount of information in the world doubles every 20 months. Today, varying estimates suggest that the amount of information in the world doubles every 10 to 13 months, and that, in our lifetimes, it could begin to double every 11 hours.

Change is coming at a furiously accelerating rate, providing us with greater and greater dominion. However, as Yuval Noah Harari argues in his bestselling book Sapiens: A Brief History of Humankind, evolution did not equip us to handle this kind of rapidly increasing power. For millions of years, Genus Homo was positioned in the middle of the food chain. Only in the past 100,000 years did we jump to the top. “Humankind ascended … so quickly that the ecosystem was not given time to adjust. Moreover, humans themselves failed to adjust. Most top predators of the planet are majestic creatures. Millions of years of dominion have filled them with self-confidence.” Our genus, by contrast, stumbles along with far less than a majestic prowess: “many historical calamities, from deadly wars to ecological catastrophes, have resulted from this overhasty jump.”

We need help adapting to cataclysmic change. Nostalgia provides aid, and so does story. Researchers discovered that character-driven narratives cause the brain to release oxytocin, which can enhance empathy, thereby motivating cooperation and leading us to trust strangers. The study, led by Paul J. Zak at Claremont Graduate University, also found that we continue to mimic the actions and feelings of characters after the story ends. If a protagonist accepts shifts in her life and finds optimism for the future, then so may we.

We know that loneliness is on the rise. We know that both nostalgia and story help us create and sustain relationships. And we know that more than 83 million of us are turning to nostalgic stories during the annual month when humans anchor themselves against change, all during a time in history when the pace of change threatens to destroy us. Maybe we’re obsessed with schmaltzy TV Christmas movies because humans understand, deep down, that the real savior, this season and every season, is each other. Come on, you knew this article would have a Hallmark ending.

***

Jane Borden is a freelance culture writer based in Los Angeles.

***

Editor: Katie Kosma

Fact checker: Sam Schuyler

Copy editor: Jacob Gross