Lumbersexuality, a Sport and a Pastime

Illustration by Homestead

Jonny Diamond | Longreads | June 2019 | 22 minutes (5,308 words)

The sound is the first thing you notice, deep and hollow, burnished steel hitting chewed-up white pine. It’s not quite the warm, resonant thok of an axe in the woods, but the nearest forest of any significance is 50 miles up the Hudson River. This is Brooklyn, one very long bow shot from the Gowanus Canal.

It’s a chilly Monday night before Thanksgiving and Kick Axe Brooklyn is surprisingly full. Around two dozen people cluster in groups of six or eight across several “ranges,” tidily built versions of the old roadhouse bar-band cages, target at one end, party at the other. There doesn’t appear to be any flannel in the crowd (for now) but there are at least three reasonably grown-out beards in plain sight. One of the beards puts his beer down next to a basket of plastic Viking helmets and walks forward to pick up an axe from a squat round block of maple (each range has one of these blocks, to which the axe is returned after it is declawed from the wood).

Nobody pays much attention as he squares himself to the softwood target 16 feet away, holding the axe — specifically, an Estwing hatchet weighing about a pound and a half — with both hands and raising it above his head. Then, in a surprisingly fluid motion, he steps toward a faded red line on the floor and releases the hatchet in the direction of several concentric red and black circles painted on the wood, axe head over handle, where it strikes fast about six inches to the left of the bull’s-eye. He shakes his head, pulls the axe from the wood, and goes to collect his beer.

Scenes like this occur with increasing frequency in cities across North America, from Toronto to Austin to L.A., as axe-throwing clubs attempt to create their own niche and fill it, something like a laidback millennial bowling alley except with deadly weapons. For some, particularly since the election of Donald Trump, the physicality and latent violence of axe throwing has served a therapeutic purpose. As Megan Stielstra wrote in an essay last year for The Believer, “I threw axes throughout the fall, waking up every morning to new impossible cruelties. … I kept trying to pass the axe to my husband, but he wouldn’t take it. ‘You need it more than I do,’ he said from behind the yellow spectator line.”

Aside from its salubrious value, the basic appeal of axe throwing is not complicated: Like bowling or billiards or darts, it is a way to give loose structure to any given social gathering. When I ask Kick Axe’s Nathan Oerstler if he’s ever had to deal with any drama among the beer-drinking axe throwers, the recently promoted “axe master” (up from “axe-pert” — there is no pun left unmade at Kick Axe, as the name might suggest) demurs, explaining that most of the axe-perts are comedians or actors — theater types — and serve as much as entertainers as they do instructors or referees: in short, they keep the people happy. Kick Axe opened in December 2017 and is more flannel-inflected theme park than bar, its employees communicating via headset about what targets need replacing, which axes need sharpening. This level of organization makes sense when you consider the hundreds of pounds of deadly steel flying through the air at any given moment, but axe throwing wasn’t always this professionalized: In fact, the origin of the axe-throwing social club is basically a bunch of bored Canadians in the mid-aughts, standing around drinking beer and chucking hatchets at backyard waste wood.

As Backyard Axe Throwing League (BATL) founder — and one of those bored Canadians — Matt Wilson recounted, people kept showing up to throw axes in his backyard, so he had no choice but to grow. And grow he did: The BATL, which has 10 locations in Canada, has since expanded into the U.S. with spots in Chicago, Nashville, Scottsdale, Houston, and Detroit. This unlikely success story has spawned competitors: Ontario’s Bad Axe now has 15 locations across the U.S.; the aforementioned Kick Axe also has locations in Philadelphia and Washington, D.C., and is opening more in Florida and Texas; and there are at least a half dozen independent axe-throwing venues across the country (including Massachusetts’s Half Axe, whose name heralds the end of the useful axe pun, or at least demarcates its nadir).

Whatever side of the border these clubs are on, most of them affect a shaggy, woodsy aesthetic, a little plaid here, some taxidermied animal there. One could say the same thing of many of their patrons, from Calgary to Orlando: red-and-black Buffalo check accenting high-cut oxblood Red Wings; gray chambray tucked into vintage denim; Carhartt jackets over Carhartt vests over old Woolworth’s shirts.

This aesthetic — lumbersexual, which entered the mainstream vernacular in 2014, at a site called GearJunkie, and was just as quickly derided on Gawker and in The Atlantic — is certainly not limited to axe-throwing clubs (one could make the case that axe throwing as a pastime has arisen, inevitably, from the aesthetic). But as a loose set of fashion signifiers, lumbersexuality has been around in some form or another for a generation, competing with any number of the self-consciously vintage looks manifested in hipster culture.

As with so many of the aesthetic strands that make up any given tangle of contemporary style-consciousness, lumbersexuality’s origins can be found on the margins, one more example of straight culture borrowing heavily from gay culture, with half the commitment and none of the risk. Beards and bears and woodsy scruff have now fully entered the mainstream as the contemporary lumbersexual reappropriates the same tropes of classic American masculinity so long adopted and amplified in LGBTQ spaces. But even the original tropes themselves — of paternal strength and rugged stoicism — are products of male fragility.

As Willa Brown points out in the perfectly titled article “Lumbersexuality and Its Discontents,” the endless talk in the past decade of a crisis of masculinity is part of a long tradition in the patriarchal American imagination. In Brown’s oft-cited 2014 account for The Atlantic, the nostalgia-ridden aesthetic of the lumberjack has always been an outsize performance instigated by the insecurities of straight, white men, be it 1905 or 2005. But where Brown saw an imminent expiration date for the lumbersexual, the look shows no sign of expiring any time soon.

As traditional hierarchies very slowly flatten into a more equitable distribution of power across society, the current crisis of masculinity is finding extended life in the backwaters of the internet. And while the real crisis of masculinity is male violence against women, the proliferation of pseudo-intellectual charlatans simultaneously seeding and harvesting the anxieties of young men for their own uses isn’t helping.

Male fragility isn’t going away. Nor is the flannel. Because there’s another performance happening here: different stage, same costume.

***

Back-to-the-land nostalgia has existed in the United States for almost as long as there’s been a United States, at various points manifesting as religious isolationism (think saucer-eyed Protestant sects one valley over), transcendentalist escapism (rich white guys reading poetry in the gloaming), and communitarian anti-capitalism. Its latest incarnation — rooted chiefly in an environmentalism that gestures at change through practice rather than policy — has been about bringing the virtues of the land back to the city, reimagining the frontier as urban rather than rural: a bespoke localism that animates everything from figurative fireside hobbies like pickling and needlepoint to larger-scale industry like rooftop farming, craft-brewing, and restorative, salvage-based building.

But in the same way the “frontier” of the 18th and 19th centuries was a romantic way of describing a slow genocidal war of settler colonialism, so too did gentrification’s border zones, from the mid-1980s to the late 2000s, serve as locations of displacement far more than of the idealized renewal imagined by urban planners. From its early days, gentrification was similarly romanticized with the language of westward expansion, those in its vanguard heralded as “settlers” and “urban pioneers.”

For good or for ill, these “pioneers” — composed largely of artists in search of an affordable life in the city, abetted by canny real estate speculators — wore the mantle proudly as they built out semi-legal living spaces in (often but not always) sparsely populated post-industrial neighborhoods, sometimes squatting entire buildings. They were essentially homesteading — stealing power from the grid rather than rendering tallow, jury-rigging plumbing instead of digging wells — leading precarious DIY lives based on many of the virtues of the old frontier: resilience, independence, ingenuity, competence.

There was among this early, punk-inflected group of gentrifiers — buried under layers of rebellion and irony — a quiet reverence for working-class utility, often expressed in an aesthetic straight from their stepfathers’ closets: old beat-up boots, blue short-sleeve work shirts (bonus points for actual name tags), paint-spattered coveralls, and … flannel.

Much ink has been spilled on the mass-cultural half-life of flannel, but it wasn’t until the Seattle grunge scene exploded into the mainstream in the early 1990s — with a look that had begun with bands like Minutemen and Minor Threat a decade earlier — that flannel would achieve its high fashion ascendancy, showing up in collections by designers like Alexander McQueen and Vivienne Westwood and never really going away. The aesthetic and political interplay of these subcultures — gay, punk, DIY — would continue through the early 2000s as a youth culture raised on environmental angst looked further into the past for alternatives to the increasingly apparent cruelties of late capitalism, withdrawing to a kind of privileged moral quiet room in the handmade, the local, the slow.

Here then was a hardworking, readymade look, an identifying aesthetic with a notional connection to virtues of self-sufficiency, sustainability, the wild, and, if not out-and-out Luddism, at least an appreciation of analog competence.

But what happens when the performance overtakes the performer, when the flannel habit intensifies from urban axe throwing to rural woodcraft? What happens, in other words, when you finally buy an axe?

Well, it depends on the axe — and the performer, for that matter. If you’re Justin Timberlake, in his Man of the Woods era, the axe in question comes with a private Montana “ranch.” Timberlake, who grew up in suburban Memphis, has lately been performing a return to nature (nature in this case being the exclusive 15,200-acre Yellowstone Club, a 21st-century millionaire land rush catering to those who want the gated community without having to see the gates). The streamable georgics resulting from this relocation — manifested as the 16 tracks on his February 2018 album, Man of the Woods — reveal little of Timberlake’s relationship to the actual woods (or mountains or fields or wilderness) and present more like a checklist of urban-versus-rural cliché, the kind you might find in the playbook of any halfway decent political operative aiming to divide and conquer. Here are some lyrics from the album’s seventh track, “Supplies”:

’Cause I’ll be the light when you can’t see

I’ll be the wood when you need heat

I’ll be the generator, turn me on when you need electricity

Some shit start to go down, I’ll be the one with the level head

The world could end now, baby, we’ll be living in The Walking Dead

Translation: My hard-won know-how (money) will save us when the poors run out of stuff. (Also, a cavil, but one doesn’t “turn on” a generator like a lamp, one starts it like a lawnmower … and “start me up” would have worked here!) In track 11, titled, naturally, “Flannel,” he sings:

Right behind my left pocket

That is where you’ll feel my soul

It’s been with me many winters

It will keep you warm

Ooh, here’s my flannel

The character’s in the way you wear it

Translation: I wear grandpa shirts and grandpas are good guys. Then, on track 14, “Living Off the Land,” we hear that:

You have to be comfortable with yourself

because that’s all there is

There’s you and nature

Soon as you think you got it all figured out, you know,

the wilderness will figure some way to teach you a lesson

As I’m alone in the forest, I’m one with my surroundings

and there’s a lot of peace in that solitude

I’ll be a mountain man ’til the day I die

 

(Living off the land)

And I break my back

And I work all night

[. . .] I’ll be damned, sometimes it’s hard,

the backed-up bills on the credit cards

Translation: One time I got a little lost on the way to Bill Gates’s cookout. It was tough. And these are the more thematically substantial tracks!

One might find more insight into how the Big West has rubbed off on the Big Pop Star with a quick look at the wilderness-adjacent merchandise from the Man of the Woods Collection, one item for each of the album’s tracks. These include nods to practical Americana like a wool Pendleton blanket, a tin of beard butter, and a trucker vest; objects from the collection that correspond to the tracks above are:

Track 7: A strongbox

Track 11: A flannel shirt, obviously

Track 14: A Best Made Co. felling axe, with custom-painted handle

These items, along with a cooler, a jean jacket, a bandanna, and more, were all available for sale at a Lower East Side pop-up shop the week the album was released, a kind of company store for Timberlake Inc.

As the brother of a trucker and an actual lumberjack, I find it hard to fully understand totems of daily labor so dramatically upsold to “influencers” under the banner of authenticity. But as obvious a target as Timberlake is for derision, he’s more of a symptom than a cause, one more in a long line of mythologized white men, from Paul Bunyan to John Wayne, out there taming the wild as they tame themselves (but not too much), spokesmodels in the endless ad campaign for America that began with Horace Greeley telling us to go west and live off the land.

And that’s the dream we’re still being peddled, embodied by the upsold axe. That the axe in question is hanging on the wall of a pop-up store in downtown New York creates a particular kind of dissonance: Timberlake Inc. is almost too perfect a microcosm for the stylized repackaging of the outdoors, for the yearning after a frontier that never really existed and the rural “working-class” sensibilities that accompany it. This commodification of rural life and labor — its ruggedness, its whiteness — feels, at best, like a post-industrial Instagram fantasy, personal branding available a la carte or by kit; at worst, it perpetuates pernicious stereotypes, both racist and classist, about natural purity and rural misery, a paradox in service of the powerful.

But life adjacent to wild spaces — and the work that sustains it — can be good, regardless of your politics. The braiding of masculinity and wilderness is as old as the American frontier, but it’s worth considering how we might untangle the two, worth considering how we might live with the forest world — and all it has to offer us — without destroying it.

***

But maybe you’re not a rich, world-famous pop star with a flannel fetish (if you’ve read this far, it’s likely you are not). Sure, axe throwing seems like a fun thing to try, but lately you’ve been spending more time upstate (whatever state that might be) car camping, or staying with friends who’ve left the city; there are campfires, fireplaces, wood to be chopped, logs to split. You are thinking of buying an axe of your own.

Where to start?

There are three basic types of axes you might acquire: a hatchet (12 to 18 inches long, around 1.5 pounds), for light camp use like limbing branches and making kindling; an all-purpose camp axe (20 to 28 inches, around 2.5 pounds), for clearing saplings and light splitting; and a felling axe (30-plus inches, between 3 and 4.5 pounds), for chopping down trees. Within each of these basic categories there are dozens of varieties, based largely on the regions from which they originate: the Allagash Cruiser, the Hudson Bay Camp Axe, the Dayton Railsplitter, etc.

Whatever you’ve chosen, the first thing you’ll notice is the weight: a multipurpose Swedish forester’s axe — weighing three pounds — is a manageable tool, useful on smaller trees and for light splitting. You’ll probably pick it up by the end of its American-hickory handle using your dominant hand. If you’re lucky, it comes to you as an already well-used and well-loved tool, the wood worn to a tacky smoothness by years of sweat and sap and the occasional reapplication of linseed oil. It will feel heavier than three pounds should.

Next, you’ll probably hoist the heavy end up into the other hand, striking a slightly awkward pose halfway between lumberjack and serial killer.

Perhaps the light will catch the burnished cheek of the blade, and you’ll reach a tentative finger to the hardened edge, which, if properly sharpened, can dry-shave the hairs from your arm. You’ll continue to feel that weight, three pounds starting to feel like 30, and you’ll begin to wonder: What can I chop with this? The axe is one of the oldest tools we have, designed, essentially, by gravity (which does most of the work anyway) — when you pick it up, you’ll want to let it fall.

Let’s say you’re in the woods — on a weekend camping trip or at a friend’s woodsy cabin — so there’s a lot it could fall on. For a first swing, a nice, newly downed log is good for practice — in a wild forest, there should be plenty of recent deadfall not yet rotten.

You stand square to the log — imagine it as Eastern red cedar, for its intense scent and lurid scarlet heartwood — and raise high the axe. The weight will do the rest. If the swing is true, there will resonate from the tree — through still-growing sapwood to the compressed cells of the dying core — a deeply satisfying, percussive boom, scattering birds and startling deer. The first swing invites another, and then another, until a deep ringing rhythm echoes through the forest. It’s hard work, but in its repetition it is meditative.

That sound, of axe on wood, calls back to a hundred generations of humankind, invites considerations of how our ancestors might have understood their place in a world covered by forest. Sitting there, axe across knees, taking a breather, it’s not so hard to imagine them.

Shaggy Briton woodsmen in the vast pre-Roman forests of Cumbria, gripping their sacred Langdale axes, with glimmering heads knapped from the rare volcanic greenstone mined from the Pike of Stickle.

A barefoot Japanese carpenter moving gingerly across a hinoki cypress, swinging his heavy, long-handled masakari, leaving palm-size chips of wood as a massive six-by-six beam reveals itself from the 16-foot log.

A pair of Basque foresters, generations ahead of the chainsaw, laboring astride two great beech trees pulled from deep within the Irati Forest, locked in a traditional aizkolaritza, a village-wide test of strength, precision, and endurance to see who might hew the finest, fastest timber.

Tireless Henderson Islanders squaring off Pacific rosewood, adzes made from giant clamshells, chewing out chocolate shavings from the dark heartwood. 

A thousand miles and a thousand years separate these moments of labor, and at the heart of each, the same basic motion: Pick up the heavy thing and let it fall; let the weight do the work, or at least half of it.

This is the allure of the axe: It is a simple, efficient tool charged with power and violence; it lets us measure our labor swing by swing, as we gather fuel for heat or timber for shelter. To look at a stand of trees, axe in hand rather than chainsaw, is to understand it not as a resource for the coming weeks or months, but for subsequent years and generations. And though the axe confers an intoxicating dominion, over woodlot and wood target both, it is a tool that invites a way of seeing that is very old indeed. The various eras of human prehistory seem named for dynastic families from alien worlds — the Mousterian, the Denisova, the Aurignacian. It is the Acheulean in which early stone hand tools begin to flourish, particularly what is now referred to by paleoanthropologists as the “hand-axe.”

The Acheulean “hand-axe” is not an axe in the modern sense; really, it’s just a big rock with two chipped-off edges, bits of flint “knapped” away to create a biface the better to dig or cut with, to remove bark from a tree or, even, to fell that tree by hand. Perhaps, also, the better to kill with, human history providing no shortage of reminders that any distinction between tool and weapon derives from delusions of civilization. 

The finer specimens of these hand-axes, unearthed across Europe and Africa, from the Fells of Cumbria to the river gorges of the Olduvai Valley, have the shape of great and heavy tears. For centuries, British farmers, turning one up with plough or spade, thought of them as thunderstones, specially formed rocks either dropped from the heart of terrible storms, or seeded deep beneath the earth by lightning strikes, gifts of creation, that man might make better dominion of a world made just for him. 

Hand-axes represent the evolution of a very basic technology, and one can imagine that moment when the blunt rock was discarded for the edged rock, followed quickly by the thought, in not so many words: “What if I made this even sharper?”

And so these rough-hewn stones-as-tools, ranging in size from an iPhone to a toaster, underwent refinement over scores of generations — and with that refinement toward balance and symmetry, they began to take on value, both material and spiritual. Hand-axes, their abundance and quality, became a symbol of wealth, a currency; and those created from rarer elements (the deeper in the earth the better) were revered as religious symbols, not to be used as tools, but rather thought of as we now think of art. As French paleoanthropologist André Leroi-Gourhan puts it, in contemplating the unlikely craftsmanship of such early humans:

It seems difficult to admit that these beings did not experience a certain aesthetic satisfaction; they were excellent craftsmen who knew how to choose their material, repair defects, and orient cracks with total precision, drawing out a form from a crude flint core that corresponded exactly to their desire. Their work was not automatic or guided by a series of actions in strict order; they were able to mobilize in each moment reflection and, of course, the pleasure of creating a beautiful object.

Though Leroi-Gourhan is writing about human beings 10,000 years ago, he could be describing a certain strain of contemporary axe maker, for whom an axe is just as at home on a pristine West Village gallery wall as it is in the back of a woodshed.

About a decade ago, Peter Buchanan-Smith, a Canadian designer living in New York City, found himself in need of a hatchet to make some kindling. Looking to grill a choice cut of meat over a hot, wood-fueled fire, Buchanan-Smith found himself unimpressed by the cheap, poorly made imports at nearby hardware stores (dull edges, synthetic handles), so he expanded his search for a better, American-made tool.

The story might have ended there, but shortly after Buchanan-Smith finally did get his hands on a decent axe, he decided to customize the handle in colorful stripes: and just like that, the Best Made Co. was born. (Buchanan-Smith declined to talk to me for this story and is, I’m told, transitioning away from the company.)

Things happened quickly from there. Buchanan-Smith, who’d won a Grammy for his art for a Wilco album cover and who’d done design work for Isaac Mizrahi and David Byrne, was well known among New York’s art and design community, and very soon after the first axe was painted, it was hanging on the wall of Partners + Spade in Manhattan. That was in May 2009; a month later, in anticipation of Father’s Day, the fledgling brand sold out its stock (100 axes) in an hour.

The past decade has been a good one for Best Made Co., with the opening of a flagship store in lower Manhattan, followed by a 2,700-square-foot showroom in L.A.; and on top of their apparent domination of the bespoke axe market, the company has gone all in with a full line of forest-forward gear and apparel. So, if anyone has a full view of the aesthetic arc of lumbersexuality, it’s Buchanan-Smith, who’s described his ideal customer as “Alaskan Charles Eames (rather than Brooklyn Grizzly Adams).” And while someone who relies on tools but also likes good design is certainly cooler than someone who dresses up like someone who relies on tools, it helps that the former usually has a little more money to spend than the latter.

One might wonder how great the difference could possibly be from one axe to the next, but it only takes an afternoon at the wood pile to appreciate good steel as opposed to bad: the former holds its shape longer, has a stronger edge, stays sharper, and is less prone to chipping or breaking, all of which makes for a safer, more efficient axe. It is taken for gospel — at least on the internet of old guys and their tools — that the older the axe, the better the steel.

If you’re looking, it’s not hard to find someone in just about every rural county in the country with a grinding wheel, a set of files, and a strop, who will take your grandfather’s axe and return it to its former glory. And for every one of those guys there are a hundred others hanging out in online forums asking one another the best way to rebevel the edge on a timber-hewing broadaxe or how to de-pit the cheek of a 100-year-old New Jersey pattern felling axe. (To its credit, Best Made’s L.A. store has a counter devoted to restoring and refurbishing old tools, from cast-iron pans to axes.)

Navigating sites like BladeForums.com and TalkBlade.info, a theme begins to emerge: New, mass-produced things are bad; old, handcrafted things are good. And while there’s an awful lot of grumpy conservatism burbling through these forums, spiked with a mild dose of over-the-counter libertarianism, if you squint past the bumper-sticker usernames and shallow isolationism, the underlying politics run parallel to much of the contemporary green movement, from the embrace of all things local to a rejection of late-capitalist disposability. Granted, from the conservative direction these politics are rooted in a nostalgia that veers into apocalyptic nativism, but it is bewildering to see how similar in outlook — when it comes to craftsmanship, consumerism, conservation — so many people are who otherwise identify with different ends of the political spectrum.

***

Politics doesn’t come up much at my return visit to Kick Axe for the opening of spring league night — it’s likely that the ideological spectrum here is similar to any Brooklyn bar on a Monday evening, which is to say not as liberal as Twitter would have you believe. I sit back and watch 76 amateur axe throwers crowd around league master Anthony Oglesby, who stands upon a stump introducing new rules and reminding competitors of the old, part carnival barker, part vice principal.

There is more flannel in this crowd than the last time I was here, more self-conscious woodsiness expressed through beards and boots, so I’m not exactly sure where Melanie Serrapica fits in. In her late 20s, Serrapica is wearing a semiformal low-cut red dress, and if it weren’t for the custom-painted hatchet she holds lightly in her right hand, its handle a gradient from lustrous black into midnight blue, I’d assume she’d entered the wrong bar.  

“[Axe throwing] is a great way to blow off steam after coming from work, where you want to throw things at people but aren’t allowed,” Serrapica deadpans, despite having to yell over the anticipatory din of her fellow axe throwers. Her friend Sara Morabito nods in agreement. “We’re two nerds who don’t do things other than conventions,” she says, gesturing to her fiancé Chris Knowles. “This was the first athletic thing where we were both like, ‘We’re really good at this.’ It’s a great thing to do together.”

Like Serrapica, Morabito and Knowles fell hard for the pleasures of axe throwing, and also have their own custom axes (hand-painted by fellow league member Tommy Agniello) — unlike Serrapica, they have yet to name their axes. “Yeah, I named it Axe-Po,” Serrapica says. “You know, like B-MO from Adventure Time?” (I don’t.) As the subject turns to axe care and sharpening technique, I ask the trio why they think axe throwing has become so popular. Chris (who favors a double-grit sharpening puck for maintaining his blade) gets to the heart of it: “It’s something that feels masculine and outdoorsy, and I think people are looking for that.”


You don’t need a gender studies degree to understand that ideas of masculine and feminine exist on a spectrum that doesn’t map across a male-female binary; in fact, the league crowd is as diverse in gender as you’d expect of a bar in Brooklyn on a Monday night. As I circulate among teams with names like Inside the Axer’s Studio, Axes of Evil, and Well, Axetually, interrupting people as they get in a few more practice throws before the competition starts, one name keeps coming up: Rebecca. The best. Unbeatable. Rebecca is the best axe thrower. “Number one last season, and the season before.” Nobody knows if she’s coming tonight, nobody seems able to spot her or her girlfriend in the crowd. Someone thinks she might have moved upstate, “to be closer to the woods,” and I can’t tell if they’re fucking with me. She’s already a legend, the more so in her absence.

People are drinking — each league night has its own beer sponsor — and it gets noticeably louder as the new season begins, the title wide open and up for grabs in this new and Rebecca-less reality. Soon I notice a woman pressing a call button next to her range, an intense look on her face: It’s too early for a wood replacement on the target, so she’s looking for a judgment. An axe-pert calls the league master over, and all parties approach the target, like lawyers approaching the bench, to peer and point at an axe stuck just off the bull’s-eye. League master Anthony waves over at Kick Axe’s manager, Nic Espier, who, with his suit and his earpiece, looks like he’d take a bullet if ordered to, and goes over to settle the issue.

“Seven points decided last year’s title,” he tells me, after judging in favor of the button-pusher. “These guys look like they’re having fun, but they take it pretty seriously.”

The pleasures of axe throwing or wood splitting or tree felling aren’t for everyone — nor, indeed, are they available to most. But it would be a shame to dismiss these things we yearn for — open spaces, wilderness, a particular kind of labor — simply because we’ve had them so relentlessly repackaged and sold back to us.

So let the axe be many things — tool, work of art, diversion — but let it also be a way back into the forest. Let this very old machine remind us of our limits and show us not what is ours to use, but ours to preserve.

***

Jonny Diamond is a writer and editor who splits his time between New York City and the Hudson Valley. His fiction and nonfiction have appeared in The Missouri Review, Geist, Hobart Pulp, Rolling Stone, Literary Hub, and elsewhere. He is currently working on a book-length object history of the axe, part investigation of its symbolism in America’s westward expansion, part interrogation of contemporary tropes of masculinity and wilderness. He is the editor-in-chief of LitHub.com.

Editor: Kelly Stout
Fact checker: Ethan Chiel
Copy editor: Jacob Gross

The Gymnast’s Position

Illustration by Homestead

Dvora Meyers | Longreads | June 2019 | 25 minutes (6,257 words)

More than two decades ago, a billboard went up in Salt Lake City near the 600 South exit of the I-15. It featured a young woman in repose clad in a sleeveless black leotard, her back to the viewer and her head tilted up. The weight of her upper body rested on her right arm, which was extended behind her; her left arm lay languidly on her bent left knee. Her right leg was extended straight in front of her, its foot arch, creating the appearance of a straight line from hip to toe.

The angle of the woman’s head seemingly bathed her face in light, her long curly blonde hair falling freely down her neck. The pose was reminiscent of Adam on the ceiling of the Sistine Chapel, only inverted.

Passersby unable to make out the words printed in small text beneath the image would be forgiven for not knowing what exactly the billboard was advertising. Was it selling a dance performance or was it an ad for workout apparel or a photography exhibit at a local gallery? Visually, there were few clues.

I’ve Done a Lot of Forgetting

Getty / Illustration by Homestead

Jordan Michael Smith | Longreads | May 2019 | 10 minutes (2,744 words)

If someone spits bigotry at you while you’re a kid, you’re unlikely to forget it. You’ll remember it not because it’s traumatic, though it can be. You’ll remember it not even because it’s degrading and excruciating, though it is certainly those things, too. No, you’ll remember it because it instills in you an understanding that people are capable of motiveless evil. That humans can be moved to hate because they are hateful. You aren’t given a reason for why people hate you, because they don’t need a reason. You’re you, through no fault of your own, even if you want desperately to be anyone else. And that’s enough.

I am a Canadian. I was born in Markham, which is a small city about 30 kilometers northeast of Toronto. That distance meant a great deal. Markham was a large town of middle- and working-class families when my newlywed parents moved there, in the late 1970s, with a population that hovered around 60,000. It was pretty mixed demographically, I recall, though containing a white majority. My older sister and I were the only Jews in our elementary school, except for one other family who arrived after we did and seemed not to attract much ire; I imagined it was because they were beautiful and popular (we were neither).

We were one of the minority of Canadian Jewish families living outside Toronto or Montreal. More than 71% of all Canadian Jews reside in these two cities, according to Allan Levine’s serviceable but unexceptional new book on the history of Jewish Canada, Seeking the Fabled City. Levine describes a familiar story of an immigrant group gradually gaining acceptance (and some power) in a once-largely white Christian country. For the first half of the 20th century, Jews in Canada were arguably detested to a greater degree than in America. By the 21st century, Canadian Jews felt as safe as Jews anywhere felt safe. Levine quotes a Toronto rabbi as saying, “Living in Toronto, my children don’t know that Jews are a minority.”

The Anarchists Who Took the Commuter Train

A matchbook ad for Pennsylvania Railroad, 1940. Jim Heimann Collection / Getty.

Amanda Kolson Hurley | An excerpt from Radical Suburbs: Experimental Living on the Fringes of the American City | Belt Publishing | April 2019 | 19 minutes (4,987 words)

The Stelton colony in central New Jersey was founded in 1915. Humble cottages (some little more than shacks) and a smattering of public buildings ranged over a 140-acre tract of scrubland a few miles north of New Brunswick. Unlike America’s better-known experimental settlements of the nineteenth century, Stelton was not a refuge for a devout religious sect but a hive of political radicals, where federal agents came snooping during the Red Scare of 1919-1920. But it was also a suburb, a community of people who moved out of the city for the sake of their children’s education and to enjoy a little land and peace. They were not even the first people to come to the area with the same idea: There was already a German socialist enclave nearby, called Fellowship Farm.

The founders of Stelton were anarchists. In the twenty-first century, the word “anarchism” evokes images of masked antifa facing off against neo-Nazis. What it meant in the early twentieth century was different, and not easily defined. The anarchist movement emerged in the mid-nineteenth century alongside Marxism, and the two were allied for a time before a decisive split in 1872. Anarchist leader Mikhail Bakunin rejected the authority of any state — even a worker-led state, as Marx envisioned — and therefore urged abstention from political engagement. Engels railed against this as a “swindle.”

But anarchism was less a coherent, unified ideology than a spectrum of overlapping beliefs, especially in the United States. Although some anarchists used violence to achieve their ends, like Leon Czolgosz, who assassinated President William McKinley in 1901, others opposed it. Many of the colonists at Stelton were influenced by the anarcho-pacifism of Leo Tolstoy and by the land-tax theory of Henry George. The most venerated hero was probably the Russian scientist-philosopher Peter Kropotkin, who argued that voluntary cooperation (“mutual aid”) was a fundamental drive of animals and humans, and opposed centralized government and state laws in favor of small, self-governing, voluntary associations such as communes and co-ops.

How the Guardian Went Digital

Newscast Limited via AP Images

Alan Rusbridger | Breaking News | Farrar, Straus and Giroux | November 2018 | 31 minutes (6,239 words)

In 1993 some journalists began to be dimly aware of something clunkily referred to as “the information superhighway,” but few had ever had reason to see it in action. At the start of 1995 only 491 newspapers were online worldwide; by June 1997 that had grown to some 3,600.

In the basement of the Guardian was a small team created by editor in chief Peter Preston — the Product Development Unit, or PDU. The inhabitants were young and enthusiastic. None of them were conventional journalists: I think the label might be “creatives.” Their job was to think of new things that would never occur to the largely middle-aged reporters and editors three floors up.

The team — eventually rebranding itself as the New Media Lab — started casting around for the next big thing. They decided it was the internet. The creatives had a PC actually capable of accessing the world wide web. They moved in hipper circles. And they started importing copies of a new magazine, Wired — the so-called Rolling Stone of technology — which had started publishing in San Francisco in 1993, along with the HotWired website. “Wired described the revolution,” it boasted. “HotWired was the revolution.” It was launched in the same month the Netscape team was beginning to assemble. Only 18 months later Netscape was worth billions of dollars. Things were moving that fast.

In time, the team in PDU made friends with three of the people associated with Wired. They were the founders, Louis Rossetto, and Jane Metcalfe; and the columnist Nicholas Negroponte, who was based at the Massachusetts Institute of Technology and who wrote mindblowing columns predicting such preposterous things as wristwatches which would “migrate from a mere timepiece today to a mobile command-and-control center tomorrow . . . an all-in-one, wrist-mounted TV, computer, and telephone.”

As if.

Both Rossetto and Negroponte were, in their different ways, prophets. Rossetto was a hot booking for TV talk shows, where he would explain to baffled hosts what the information superhighway meant. He’d tell them how smart the internet was, and how ethical. Sure, it was a “dissonance amplifier.” But it was also a “driver of the discussion” towards the real. You couldn’t mask the truth in this new world, because someone out there would weigh in with equal force. Mass media was one-way communication. The guy with the antenna could broadcast to billions, with no feedback loop. He could dominate. But on the internet every voice was going to be equal to every other voice.

“Everything you know is wrong,” he liked to say. “If you have a preconceived idea of how the world works, you’d better reconsider it.”

Negroponte, 50-something, East Coast gravitas to Rossetto’s Californian drawl, was working on a book, Being Digital, and was equally passionate in his evangelism. His mantra was to explain the difference between atoms — which make up the physical artifacts of the past — and bits, which travel at the speed of light and would be the future. “We are so unprepared for the world of bits . . . We’re going to be forced to think differently about everything.”

I bought the drinks and listened.

Over dinner in a North London restaurant, Negroponte started with convergence — the melting of all boundaries between TV, newspapers, magazines, and the internet into a single media experience — and moved on to the death of copyright, possibly the nation state itself. There would be virtual reality, speech recognition, personal computers with inbuilt cameras, personalized news. The entire economic model of information was about to fall apart. The audience would pull rather than wait for old media to push things as at present. Information and entertainment would be on demand. Overly hierarchical and status-conscious societies would rapidly erode. Time as we knew it would become meaningless — five hours of music would be delivered to you in less than five seconds. Distance would become irrelevant. A UK paper would be as accessible in New York as it was in London.

Writing 15 years later in the Observer, the critic John Naughton compared Sir Tim Berners-Lee, begetter of the world wide web, with Johannes Gutenberg, whose invention of movable type had caused a seismic disruption five centuries earlier. Just as Gutenberg had no conception of his invention’s eventual influence on religion, science, systems of ideas, and democracy, so — in 2008 — “it will be decades before we have any real understanding of what Berners-Lee hath wrought.”

And so I decided to go to America with the leader of the PDU team, Tony Ageh, and see the internet for myself. A 33-year-old “creative,” Ageh had had exactly one year’s experience in media — as an advertising copy chaser for The Home Organist magazine — before joining the Guardian. I took with me a copy of The Internet for Dummies. Thus armed, we set off to America for a four-day, four-city tour.

In Atlanta, we found the Atlanta Journal-Constitution (AJC), which was considered a thought leader in internet matters, having joined the Prodigy Internet Service, an online service offering subscribers information over dial-up 1,200 bit/second modems. After four months the internet service had 14,000 members, paying 10 cents a minute to access online banking, messaging, full webpage hosting and live share prices.

The AJC business plan envisaged building to 35,000 or 40,000 members by year three. By that time, they calculated, they would be earning $3.3 million in subscription fees and $250,000 a year in advertising. “If it all goes to plan,” David Scott, the publisher of the Electronic Information Service, told us, “it’ll be making good money. If it goes any faster, this is a real business.”

We also met Michael Gordon, the managing editor. “The appeal to the management is, crudely, that it is so much cheaper than publishing a newspaper,” he said.

We wrote it down.

“We know there are around 100,000 people in Atlanta with PCs. There are, we think, about one million people wealthy enough to own them. Guys see them as a toy; women see them as a tool. The goldmine is going to be the content, which is why newspapers are so strongly placed to take advantage of this revolution. We’re out to maximize our revenue by selling our content any way we can. If we can sell it on CD-ROM or TV as well, so much the better.”

“Papers? People will go on wanting to read them, though it’s obviously much better for us if we can persuade them to print them in their own homes. They might come in customized editions. Edition 14B might be for females living with a certain income.”

It was heady stuff.

From Atlanta we hopped up to New York to see the Times’s online service, @Times. We found an operation consisting of an editor plus three staffers and four freelancers. The team had two PCs, costing around $4,000 each. The operation was confident, but small.

The @Times content was weighted heavily towards arts and leisure. The opening menus offered a panel with about 15 reviews of the latest films, theatre, music, and books – plus book reviews going back two years. The site offered the top 15 stories of the day, plus some sports news and business.

There was a discussion forum about movies, with 47 different subjects being debated by 235 individual subscribers. There was no archive because — in one of the most notorious newspaper licensing cock-ups in history — the NYT in 1983 had given away all rights to its electronic archive (for all material more than 24 hours old) in perpetuity to Mead/Lexis.

That deal alone told you how nobody had any clue what was to come.

We sat down with Henry E. Scott, the group director of @Times. “Sound and moving pictures will be next. You can get them now. I thought about it the other day, when I wondered about seeing 30 seconds of The Age of Innocence. But then I realized it would take 90 minutes to download that and I could have seen more or less the whole movie in that time. That’s going to change.”

But Scott was doubtful about the lasting value of what they were doing — at least, in terms of news. “I can’t see this replacing the newspaper,” he said confidently. “People don’t read computers unless it pays them to, or there is some other pressing reason. I don’t think anyone reads a computer for pleasure. The San Jose Mercury [News] has put the whole newspaper online. We don’t think that’s very sensible. It doesn’t make sense to offer the entire newspaper electronically.”

We wrote it all down.

“I can’t see the point of news on-screen. If I want to know about a breaking story I turn on the TV or the radio. I think we should only do what we can do better than in print. If it’s inferior to the print version there’s no point in doing it.”

Was there a business plan? Not in Scott’s mind. “There’s no way you can make money out of it if you are using someone else’s server. I think the LA Times expects to start making money in about three years’ time. We’re treating it more as an R & D project.”

From New York we flitted over to Chicago to see what the Tribune was up to. In its 36-storey Art Deco building — a spectacular monument to institutional self-esteem — we found a team of four editorial and four marketing people working on a digital service, with the digital unit situated in the middle of the newsroom. The marketeers were beyond excited about the prospect of being able to show houses or cars for sale and arranged a demonstration. We were excited, too, even if the pictures were slow and cumbersome to download.

We met Joe Leonard, associate editor. “We’re not looking at Chicago Online as a money maker. We’ve no plans even to break even at this stage. My view is simply that I’m not yet sure where I’m going, but I’m on the boat, in the water — and I’m ahead of the guy who is still standing on the pier.”

Reach before revenue.

Finally we headed off to Boulder, Colorado, in the foothills of the Rockies, where Knight Ridder had a team working on their vision of the newspaper of tomorrow. The big idea was, essentially, what would become the iPad — only the team in Boulder hadn’t got much further than making an A4 block of wood with a “front page” stuck on it. The 50-something director of the research centre, Roger Fidler, thought the technology capable of realizing his dream of a “personal information appliance” was a couple of years off.

Tony and I had filled several notebooks. We were by now beyond tired and talked little over a final meal in an Italian restaurant beneath the Rocky Mountains.

We had come. We had seen the internet. We were conquered.

* * *

Looking back from the safe distance of nearly 25 years, it’s easy to mock the fumbling, wildly wrong predictions about where this new beast was going to take the news industry. We had met navigators and pioneers. They could dimly glimpse where the future lay. Not one of them had any idea how to make a dime out of it, but at the same time they intuitively sensed that it would be more reckless not to experiment. It seemed reasonable to assume that — if they could be persuaded to take the internet seriously — their companies would dominate in this new world, as they had in the old world.

We were no different. After just four days it seemed blindingly obvious that the future of information would be mainly digital. Plain old words on paper — delivered expensively by essentially Victorian production and distribution methods — couldn’t, in the end, compete. The future would be more interactive, more image-driven, more immediate. That was clear. But how on earth could you graft a digital mindset and processes onto the stately ocean liner of print? How could you convince anyone that this should be a priority when no one had yet worked out how to make any money out of it? The change, and therefore the threat, was likely to happen rapidly and maybe violently. How quickly could we make a start? Or was this something that would be done to us?

In a note for Peter Preston on our return I wrote, “The internet is fascinating, intoxicating . . . it is also crowded out with bores, nutters, fanatics and middle managers from Minnesota who want the world to see their home page and CV. It’s a cacophony, a jungle. There’s too much information out there. We’re all overloaded. You want someone you trust to fillet it, edit it and make sense of it for you. That’s what we do. It’s an opportunity.”

I spent the next year trying to learn more, and then the calendar clicked on to 1995 — The Year the Future Began, at least according to the cultural historian W. Joseph Campbell, who used the phrase as his book title twenty years later. It was the year Amazon.com, eBay, Craigslist, and Match.com established their presence online. Microsoft launched Windows 95 with weeks of marketing hype and a $300m campaign, paying millions for the rights to the Rolling Stones hit “Start Me Up,” which became the launch anthem.

Cyberspace — as the cyber dystopian Evgeny Morozov recalled, looking back on that period — felt like space itself. “The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (‘Internet Explorer,’ ‘Netscape Navigator’).”

But, as Campbell was to reflect, “no industry in 1995 was as ill-prepared for the digital age, or more inclined to pooh-pooh the disruptive potential of the Internet and World Wide Web, than the news business.” It suffered from what he called “innovation blindness” — “an inability, or a disinclination to anticipate and understand the consequences of new media technology.”

1995 was, then, the year the future began. It happened also to be the year in which I became editor of the Guardian.

* * *

I was 41 and had not, until very recently, really imagined this turn of events. My journalism career took a traditional enough path. A few years reporting; four years writing a daily diary column; a stint as a feature writer — home and abroad. In 1986 I left the Guardian to be the Observer’s television critic. When I rejoined the Guardian I was diverted towards a route of editing — launching the paper’s Saturday magazine followed by a daily tabloid features section and moving to be deputy editor in 1993. Peter Preston — unshowy, grittily obstinate, brilliantly strategic — looked as if he would carry on editing for years to come. It was a complete surprise when he took me to the basement of the resolutely unfashionable Italian restaurant in Clerkenwell he favored, to tell me he had decided to call it a day.

On most papers the proprietor or chief executive would find an editor and take him or her out to lunch to do the deal. On the Guardian — at least according to tradition dating back to the mid-70s — the Scott Trust made the decision after balloting the staff, a process that involved manifestos, pub hustings, and even, by some candidates, a little frowned-on campaigning.

I supposed I should run for the job. My mission statement said I wanted to boost investigative reporting and get serious about digital. It was, I fear, a bit Utopian. I doubt much of it impressed the would-be electorate. British journalists are programmed to skepticism about idealistic statements concerning their trade. Nevertheless, I won the popular vote and was confirmed by the Scott Trust after an interview in which I failed to impress at least one Trustee with my sketchy knowledge of European politics. We all went off for a drink in the pub round the back of the office. A month later I was editing.

“Fleet Street,” as the UK press was collectively called, was having a torrid time, not least because the biggest beast in the jungle, Rupert Murdoch, had launched a prolonged price war that was playing havoc with the economics of publishing. His pockets were so deep he could afford to slash the price of The Times almost indefinitely — especially if it forced others out of business.

Reach before revenue — as it wasn’t known then.

The newest kid on the block, the Independent, was suffering the most. To their eyes, Murdoch was behaving in a predatory way. We calculated the Independent titles were losing around £42 million (nearly £80 million in today’s money). Murdoch’s Times, by contrast, had seen its sales rocket 80 per cent by cutting its cover prices to below what it cost to print and distribute. The circulation gains had come at a cost — about £38 million in lost sales revenue. But Murdoch’s TV business, BSkyB, was making booming profits and the Sun continued to throw off huge amounts of cash. He could be patient.

The Telegraph had been hit hard — losing £45 million in circulation revenues through cutting the cover price by 18 pence. The end of the price war left it slowly clawing back lost momentum, but it was still £23 million adrift of where it had been the previous year. Murdoch — as so often — had done something bold and aggressive. Good for him, not so good for the rest of us. Everyone was tightening their belts in different ways. The Independent effectively gave up on Scotland. The Guardian saved a million a year in newsprint costs by shaving half an inch off the width of the paper.

The Guardian, by not getting into the price war, had “saved” around £37 million it would otherwise have lost. But its circulation had been dented by about 10,000 readers a day. Moreover, the average age of the Guardian reader was 43 — something that pre-occupied us rather a lot. We were in danger of having a readership too old for the job advertisements we carried.

Though the Guardian itself was profitable, the newspaper division was losing nearly £12 million (north of £21 million today). The losses were mainly due to the sister Sunday title, the Observer, which the Scott Trust had purchased as a defensive move against the Independent in 1993. The Sunday title had a distinguished history, but was hemorrhaging cash: £11 million losses.

Everything we had seen in America had to be put on hold for a while. The commercial side of the business never stopped reminding us that only three percent of households owned a PC and a modem.

* * *

But the digital germ was there. My love of gadgets had not extended to understanding how computers actually worked, so I commissioned a colleague to write a report telling me, in language I could understand, how our computers measured up against what the future would demand. The Atex system we had installed in 1987 gave everyone a dumb terminal on their desk — little more than a basic word processor. It couldn’t connect to the internet, though there was a rudimentary internal messaging system. There was no word count or spellchecker and storage space was limited. It could not be used with floppy disks or CD-ROMs. Within eight years of purchase it was already a dinosaur.

There was one internet connection in the newsroom, though most reporters were unaware of it. It was rumored that downstairs a bloke called Paul in IT had a Mac connected to the internet through a dial-up modem. Otherwise we were sealed off from the outside world.

Some journalist geeks began to invent Heath Robinson solutions to make the inadequate kit in Farringdon Road do the things we wanted in order to produce a technology website. Tom Standage — he later became deputy editor of the Economist, but was then a freelance tech writer — wrote some scripts to take articles out of Atex and format them into HTML so they could be moved onto the modest Mac web server — our first content management system, if you like. If too many people wanted to read the tech site at once, the system crashed. So Standage and the site’s editor, Azeem Azhar, would take it in turns sitting in the server room in the basement of the building, rebooting the machines by hand — unplugging them and physically moving the internet cables from one machine to another.

What would the future look like? We imagined personalized editions, even if we had not the faintest clue how to produce them. We guessed that readers might print off copies of the Guardian in their homes — and even toyed with the idea of buying every reader a printer. There were glimmers of financial hope. Our readers were spending £56 million a year buying the Guardian but we retained none of it: the money went on paper and distribution. In the back of our minds we ran calculations about how the economics of newspapers would change if we could save ourselves the £56 million a year “old world” cost.

On top of editing, the legal entanglements sometimes felt like a full-time job on their own. Trying to engineer a digital future for the Guardian felt like a third job. There were somehow always more urgent issues. By March 1996, ideas we’d hatched in the summer of 1995 to graft the paper onto an entirely different medium were already out of date. That was a harbinger of the future. No plans in the new world lasted very long.

It was now apparent that we couldn’t get away with publishing selective parts of the Guardian online. Other newspapers had shot that fox by pushing out everything. We were learning about the connectedness of the web — and the IT team tentatively suggested that we might use some “offsite links” to other versions of the same story to save ourselves the need to write our own version of everything. This later became the mantra of the City University of New York (CUNY) digital guru Jeff Jarvis — “Do what you do best, and link to the rest.”

We began to grapple with numerous basic questions about the new waters into which we were gingerly dipping our toes.

Important question: Should we charge?

The Times and the Telegraph were both free online. A March 1996 memo from Bill Thompson, a developer who had joined the Guardian from Pipex, ruled it out:

I do not believe the UK internet community would pay to read an online edition of a UK newspaper. They may pay to look at an archive, but I would not support any attempt to make the Guardian a subscription service online . . . It would take us down a dangerous path.

In fact, I believe that the real value from an online edition will come from the increased contact it brings with our readers: online newspapers can track their readership in a way that print products never can, and the online reader can be a valuable commodity in their own right, even if they pay nothing for the privilege.

Thompson was prescient about how the overall digital economy would work — at least for players with infinitely larger scale and vastly more sophisticated technology.

What time of day should we publish?

The electronic Telegraph was published at 8 a.m. each day — mainly because of its print production methods. The Times, more automated, was available as soon as the presses started rolling. The Guardian started making some copy available from first edition through to the early hours. It would, we were advised, be fraught with difficulties to publish stories at the same time they were ready for the press.

Why were we doing it anyway?

Thompson saw the dangers of cannibalization: readers might stop buying the paper if they could read it for free online. The online edition could also be seen as a form of marketing. His memo seemed ambivalent as to whether we should venture into this new world at all:

The Guardian excels in presenting information in an attractive easy to use and easy to navigate form. It is called a “broadsheet newspaper.” If we try to put the newspaper on-line (as the Times has done) then we will just end up using a new medium to do badly what an old medium does well. The key question is whether to make the Guardian a website, with all that entails in terms of production, links, structure, navigational aids etc. In summer 1995 we decided that we would not do this.

But was that still right a year later? By now we had the innovation team — PDU — still in the basement of one building in Farringdon Road, and another team in a Victorian loft building across the way in Ray Street. We were, at the margins, beginning to pick up some interesting fringe figures who knew something about computers, if not journalism. But none of this was yet pulling together into a coherent picture of what a digital Guardian might look like.

An 89-page business plan drawn up in October 1996 made it plain where the priorities lay: print.

We wanted to keep growing the Guardian circulation, aiming for a modest increase to 415,000 by March 2000 — which would make us the ninth-biggest paper in the UK — with the Observer aiming for 560,000 with the aid of additional sections. A modest investment of £200,000 a year in digital was dwarfed by an additional £6 million cash injection into the Observer, spread over three years.

As for “on-line services” (we were still hyphenating it) we did want “a leading-edge presence” (whatever that meant), but essentially we thought we had to be there because we had to be there. By being there we would learn and innovate and — surely? — there were bound to be commercial opportunities along the road. It wasn’t clear what.

We decided we might usefully take broadcasting, rather than print, as a model — emulating its “immediacy, movement, searchability and layering.”

If this sounded as if we were a bit at sea, we were. We hadn’t published much digitally to this point. We had taken half a dozen meaty issues — including parliamentary sleaze, and a feature on how we had continued to publish on the night our printing presses had been blown up by the IRA — and turned them into special reports.

It is a tribute to our commercial colleagues that they managed to pull in the thick end of half a million pounds to build these websites. Other companies’ marketing directors were presumably like ours — anxious about the youth market and keen for their brands to feel “cool.” In corporate Britain in 1996, there was nothing much cooler than the internet, even if not many people had it, knew where to find it or understood what to do with it.

* * *

The absence of a controlling owner meant we could run the Guardian in a slightly different way from some papers. Each day began with a morning conference open to anyone on the staff. In the old Farringdon Road office, it was held around two long narrow tables in the editor’s office — perhaps 30 or 40 people sitting or standing. When we moved to our new offices at Kings Place, near Kings Cross in North London, we created a room that was, at least theoretically, less hierarchical: a horseshoe of low yellow sofas with a further row of stools at the back. In this room would assemble a group of journalists, tech developers and some visitors from the commercial departments every morning at about 10 a.m. If it was a quiet news day we might expect 30 or so. On big news days, or with an invited guest, we could host anything up to 100.

A former Daily Mail journalist, attending his first morning conference, muttered to a colleague in the newsroom that it was like Start the Week — a Monday morning BBC radio discussion program. All talk and no instructions. In a way, he was right: It was difficult, in conventional financial or efficiency terms, to justify 50 to 60 employees stopping work to gather together each morning for anything between 25 and 50 minutes. No stories were written during this period, no content generated.

But something else happened at these daily gatherings. Ideas emerged and were kicked around. Commissioning editors would pounce on contributors and ask them to write the thing they’d just voiced. The editorial line of the paper was heavily influenced, and sometimes changed, by the arguments we had. The youngest member of staff would be in the same room as the oldest: They would be part of a common discussion around news. By a form of accretion and osmosis an idea of the Guardian was jointly nourished, shared, handed down, and crafted day by day.

It led to a very strong culture. You might love the Guardian or despise it, but it had a definite sense of what it believed in and what its journalism was. It could sometimes feel like an intimidating meeting — even for, or especially for, the editor. The culture was intended to be one of challenge: If we’d made a wrong decision, or slipped up factually or tonally, someone would speak up and demand an answer. But challenge was different from blame: It was not a meeting for dressing downs or bollockings. If someone had made an error the previous day we’d have the post-mortem or unpleasant conversation outside the room. We’d encourage people to want to contribute to this forum, not make them fear disapproval or denunciation.

There was a downside to this. It could, and sometimes did, lead to a form of group-think. However herbivorous the culture we tried to nurture, I was conscious of some staff members who felt awkward about expressing views outside what we hoped was a fairly broad consensus. But, more often, there would be a good discussion on two or three of the main issues of the day. We encouraged specialists or outside visitors to come in and discuss breaking stories. Leader writers could gauge the temperature of the paper before penning an editorial. And, from time to time, there would be the opposite of consensus: Individuals, factions, or groups would come and demand we change our line on Russia, bombing in Bosnia, intervention in Syria, Israel, blood sports, or the Labour leadership.

The point was this: The Guardian was not one editor’s plaything or megaphone. It emerged from a common conversation — and was open to internal challenge when editorial staff felt uneasy about aspects of our journalism or culture.

* * *

Within two years — slightly uncomfortable at the power I had acquired as editor — I gave some away. I wanted to make correction a natural part of the journalistic process, not a bitterly contested post-publication battleground designed to be as difficult as possible.

We created a new role on the Guardian: a readers’ editor. He or she would be the first port of call for anyone wanting to complain about anything we did or wrote. The readers’ editor would have daily space in the paper — off-limits to the editor — to correct or clarify anything and would also have a weekly column to raise broader issues of concern. It was written into the job description that the editor could not interfere. And the readers’ editor was given the security that he/she could not be removed by the editor, only by the Scott Trust.

On most papers editors had sat in judgment on themselves. They commissioned pieces, edited and published them — and then were supposed neutrally to assess whether their coverage had, in fact, been truthful, fair, and accurate. An editor might ask a colleague — usually a managing editor — to handle a complaint, but he/she was in charge from beginning to end. It was an autocracy. That mattered even more in an age when some journalism was moving away from mere reportage and observation to something closer to advocacy or, in some cases, outright pursuit.

Allowing even a few inches of your own newspaper to be beyond your direct command meant that your own judgments, actions, ethical standards and editorial decisions could be held up to scrutiny beyond your control. That, over time, was bound to change your journalism. Sunlight is the best disinfectant: that was the journalist-as-hero story we told about what we do. So why wouldn’t a bit of sunlight be good for us, too?

The first readers’ editor was Ian Mayes, a former arts and obituaries editor then in his late 50s. We felt the first person in the role needed to have been a journalist — and one who would command instant respect from a newsroom that otherwise might be somewhat resistant to having its work publicly critiqued or rebutted. There were tensions and some resentment, but Ian’s experience, fairness and flashes of humor eventually won most people round.

One or two of his early corrections convinced staff and readers alike that he had a light touch about the fallibility of journalists:

In our interview with Sir Jack Hayward, the chairman of Wolverhampton Wanderers, page 20, Sport, yesterday, we mistakenly attributed to him the following comment: “Our team was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Sir Jack had just declined the offer of a hot drink. What he actually said was: “Our tea was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Profuse apologies.

In an article about the adverse health effects of certain kinds of clothing, pages 8 and 9, G2, August 5, we omitted a decimal point when quoting a doctor on the optimum temperature of testicles. They should be 2.2 degrees Celsius below core body temperature, not 22 degrees lower.

But in his columns he was capable of asking tough questions about our editorial decisions — often prompted by readers who had been unsettled by something we had done. Why had we used a shocking picture which included a corpse? Were we careful enough in our language around mental health or disability? Why so much bad language in the Guardian? Were we balanced in our views of the Kosovo conflict? Why were Guardian journalists so innumerate? Were we right to link to controversial websites?

In most cases Mayes didn’t come down on one side or another. He would often take readers’ concerns to the journalist involved and question them — sometimes doggedly — about their reasoning. We learned more about our readers through these interactions; and we hoped that Mayes’s writings, candidly explaining the workings of a newsroom, helped readers better understand our thinking and processes.

It was, I felt, good for us to be challenged in this way. Mayes was invaluable in helping devise systems for the “proper” way to correct the record. A world in which — to coin a phrase — you were “never wrong for long” posed the question of whether you went in for what Mayes termed “invisible mending.” Some news organizations would quietly amend whatever it was that they had published in error, no questions asked. Mayes felt differently: The act of publication was something on the record. If you wished to correct the record, the correction should be visible.

We were some years off the advent of social media, in which any error was likely to be pounced on in a thousand hostile tweets. But we had some inkling that the iron grip of centralized control that a newspaper represented was not going to last.

I found liberation in having created this new role. There were few things editors enjoyed less than the furious early morning phone call or email from the irate subject of their journalism. Either the complainant is wrong, in which case time is wasted in heated self-justification; or they’re right, wholly or partially. Immediately you’re into remorseful calculations about saving face. If readers knew we honestly and rapidly — even immediately — owned up to our mistakes they should, in theory, trust us more. That was the David Broder theory, and I bought it. Readers certainly made full use of the readers’ editor’s existence. Within five years Mayes was dealing with around 10,000 calls, emails, and letters a year — leading to around 1,200 corrections, big and small. It’s not, I think, that we were any more error-prone than other papers. But if you win a reputation for openness, you’d better be ready to take it as seriously as your readers will.

Our journalism became better. If, as a journalist, you know there are a million sleuth-eyed editors out there waiting to leap on your tiniest mistake, it makes you more careful. It changes the tone of your writing. Our readers often know more than we do. That became a mantra of the new world, coined by the blogger and academic Dan Gillmor in his 2004 book We the Media, but it was already becoming evident in the late 1990s.

The act of creating a readers’ editor felt like a profound recognition of the changing nature of what we were engaged in. Journalism was not an infallible method guaranteed to result in something we would proclaim as The Truth — but a more flawed, tentative, iterative and interactive way of getting towards something truthful.

Admitting that felt both revolutionary and releasing.

***

Excerpted from Breaking News: The Remaking of Journalism and Why It Matters Now by Alan Rusbridger. Published by Farrar, Straus and Giroux on November 27, 2018. Copyright © 2018 by Alan Rusbridger. All rights reserved.

Longreads Editor: Aaron Gilbreath

The Ugly History of Beautiful Things: Pearls

Illustration by Jacob Stead

Katy Kelleher | Longreads | March 2019 | 16 minutes (4,107 words)

In The Ugly History of Beautiful Things, Katy Kelleher lays bare the dark underbellies of the things we adorn ourselves with. Previously: the grisly sides of perfume and angora.

* * *

“There was once upon a time a very old woman, who lived with her flock of geese in a waste place among the mountains, and there had a little house,” begins The Goose Girl at the Well. Published by the Brothers Grimm, this strange little story describes a princess who comes to live with a poor crone in that wretched waste place after she fails her father’s Lear-like test to profess her love and devotion. The girl is lovely, as befits a fairy-tale princess — “white as snow, as rosy as apple-blossom, and her hair as radiant as sun-beams” — but there is one detail that always snags in my mind: “When she cried, not tears fell from her eyes, but pearls and jewels only.”

The rest of the story is a bit boring, I’m sorry to say. The girl returns home, the king learns his folly, and the old woman disappears into thin air, taking only the precious stones that fell from the girl’s magical tear ducts. But it ends on a funny note:

This much is certain, that the old woman was no witch, as people thought, but a wise woman who meant well. Very likely it was she who, at the princess’s birth, gave her the gift of weeping pearls instead of tears. That does not happen now-a-days, or else the poor would soon become rich.

I wish Grimm’s narrator had lived to see our world, one where pearls are so inexpensive that almost anyone can own a pearl necklace or a set of earrings. These gemstones are no longer precious, and they come neither from red-rimmed eyes nor from secret caverns in the ocean, but from underwater baskets strung together on sprawling sea-farms. Pearls were once mystical objects, believed by some to be the tears of Eve, by others to be the tears of Aphrodite. There are stories of pearls falling out of women’s mouths when they utter sweet words, and pearls appearing from the spray of sea foam as a goddess is born. Now we know better: pearls are made from some of the most basic and common building blocks of nature — calcium, carbon, oxygen — arranged into calcium carbonate particles, bound together by organic proteins. They are created out of animal pain, which has been sublimated into something iridescent and smooth, layered and lovely. Born of irritation, these gemstones can be mass-produced and purchased with the click of a button. These gems, like so many things, have lost some of their luster thanks to the everyday degradation of value that comes with globalization and 24/7 access to consumer goods. Thanks to Amazon, you no longer need to plumb the depths of a river or visit a jeweler to purchase a set of freshwater pearl drops. With one-click ordering, you can have a pair of dangling ivory orbs delivered to your house within days — in some places, hours.

And yet: imagine opening an oyster and seeing that slimy amorphous lump of muscle, and nestled among it, a single pearl. The fact that such iridescent, shape-shifting beauty can come from a mucus-y mollusk remains something of a miracle, primal evidence that the world orients itself toward beauty. Or so I want to believe.

Read more…

Los Angeles Plays Itself

AP Photo/Reed Saxon

David L. Ulin | Sidewalking | University of California Press | October 2015 | 41 minutes (8,144 words)

 

“I want to live in Los Angeles, but not the one in Los Angeles.”

— Frank Black

 

One night not so many weeks ago, I went to visit a friend who lives in West Hollywood. This used to be an easy drive: a geometry of short, straight lines from my home in the mid-Wilshire flats — west on Olympic to Crescent Heights, north past Santa Monica Boulevard. Yet like everywhere else these days, it seems, Los Angeles is no longer the place it used to be. Over the past decade and a half, the city has densified: building up and not out, erecting more malls, more apartment buildings, more high-rises. At the same time, gridlock has become increasingly terminal, and so, even well after rush hour on a weekday evening, I found myself boxed in and looking for a shortcut, which, in an automotive culture such as this one, means a whole new way of conceptualizing urban space.

There are those (myself among them) who would argue that the very act of living in L.A. requires an ongoing process of reconceptualization, of rethinking not just the place but also our relationship to it, our sense of what it means. As much as any city, Los Angeles is a work-in-progress, a landscape of fragments where the boundaries we take for granted in other environments are not always clear. You can see this in the most unexpected locations, from Rick Caruso’s Grove to the Los Angeles County Museum of Art, where Chris Burden’s sculpture “Urban Light” — a cluster of 202 working vintage lampposts — fundamentally changed the nature of Wilshire Boulevard when it was installed in 2008. Until then, the museum (like so much of L.A.) had resisted the street, the pedestrian, in the most literal way imaginable, presenting a series of walls to the sidewalk, with a cavernous entry recessed into the middle of a long block. Burden intended to create a catalyst, a provocation; “I’ve been driving by these buildings for 40 years, and it’s always bugged me how this institution turned its back on the city,” he told the Los Angeles Times a week before his project was lit. When I first came to Los Angeles a quarter of a century ago, the area around the Museum was seedy; it’s no coincidence that in the film Grand Canyon, Mary Louise Parker gets held up at gunpoint there. Take a walk down Wilshire now, however, and you’ll find a different sort of interaction: food trucks, pedestrians, tourists, people from the neighborhood.

Read more…

Maybe What We Need Is … More Politics?

Alfred Gescheidt / Getty Images

Aaron Timms | Longreads | February 2019 | 20 minutes (5,514 words)

Alpacas are native to South America, but to find the global center of alpaca spinning you’ll need to travel to Bradford, England. The man most responsible for this quirk of history is Titus Salt. Until the 1830s alpaca yarn was considered an unworkable material throughout Europe. Salt, a jobbing young entrepreneur from the north of England, commercialized a form of alpaca warp that made the animal’s fleece suitable for mass production. Within a decade alpaca, finer and softer than wool, had become the rage of England’s fashionable classes.

Already by the mid-19th century industrialization had begun to disfigure the English countryside with “machinery and tall chimneys, out of which interminable serpents of smoke trailed themselves for ever and ever, and never got uncoiled,” as Dickens put it in Hard Times. The immiseration of the working classes was under way. Troubled by the emerging horrors of the new industrial age, Salt built a model village to house the workers he employed in his textile mill. Saltaire, with its neat, spacious houses, running water, efficient sewerage, parks, schools and recreational facilities, became a symbol of what enlightened capitalism could look like. It was also a model in the truest sense, serving as the inspiration for workers’ villages built later in the 19th century by companies such as Cadbury’s and Lever Brothers, the soap manufacturer that eventually became Unilever.

According to economist Paul Collier, these Victorian capitalists instituted a tradition that survives, however precariously, today: the tradition of “business with purpose, business with a sense of obligation to a workforce and a community.” Among the modern successors of this model of compassionate capitalism, Collier has argued, are U.S. pharmaceutical giant Johnson & Johnson and John Lewis & Partners, the British department store. In the 1940s Johnson & Johnson set out a credo stating that the company’s first responsibility was to its customers. Thanks to this credo, Johnson & Johnson’s management led a mass recall of Tylenol from supermarket and pharmacy shelves following a contamination scare in the early 1980s. Now standard practice, this type of product recall was uncommon for its time — and allowed the company to maintain goodwill with its customers. John Lewis, for its part, has prospered through difficult decades for brick-and-mortar retail largely thanks to its unusual power structure: the company is owned by a trust run in the interests of its workforce.

The thread uniting this strain of capitalism, Collier contends in his new book The Future of Capitalism: Facing The New Anxieties, is ethics. An ethics of reciprocal responsibility and care — between owners, workers, and customers — has allowed different businesses to prosper in different eras without destroying the communities and environments around them. But very few businesses are run according to these principles today. According to Collier, it is to this model of reciprocal ethics that capitalism, having lost its way over the past four decades, now must return — and reciprocity must become the principle that guides human interaction at all levels of society, not just in the firm. “Our sense of mutual regard has to be rebuilt,” he says. “Public policy needs to be complemented by a sense of purpose among firms.” “We need to meet each other.” “A new generation needs to reset social narratives.” “Norms need to change.” Prescriptivism today, the future of capitalism tomorrow.

Read more…

Lean On

Getty / Bloomsbury Publishing

Briallen Hopper | excerpted from Hard to Love: Essays and Confessions | February 2019 | 25 minutes (6,215 words)

I like to lean. Too much of the time I have to hold myself up, so if an opportunity to swoon presents itself, I take it. When I’m getting a haircut and the lady asks me to lean back into the basin for a shampoo, I let myself melt. My muscles go slack, my eyes fall shut, and there is nothing holding me except gravity and the chair and the water and her hands on my head. I feel my tears of bliss slide into the suds.

In photos I am often leaning. When I’m not resting my head on someone’s shoulder, I am hugging a column in a haunted castle in Great Barrington or bracing myself against a big block of basalt on a pedestal in a Barcelona park. At home alone, I improvise with bookshelves and doorjambs, but sometimes I need to lean on something alive. Seeking support on a stormy night, I run out into the rain and lean against the dogwood tree in front of my house until the wet bark soaks through my coat. The world is my trellis.

Ten years ago, I bought a Gordon Parks print of Paul Newman and Joanne Woodward leaning against each other by lamplight on a big brass bed. They are sitting side by side, eyes closed, serene. He is leaning more heavily, his body slanted into hers, his head on her shoulder. She is resting more gently, her cheek against the top of his head. Her face is half-illuminated, half-eclipsed. They seem solemn and private and young. He is quiet in her shadow.

I hung the photograph over my bed. Next to it I tacked another 1950s Paul and Joanne picture I tore out of a book. They are leaning on a bed again, and he is still slumped against her shoulder, but this time the lean seems more in league with an audience. They are both meeting the photographer’s gaze and smiling small smiles. Her eyebrows are slightly raised; she might be sly or smug. She is holding a cup of tea in one hand, and his head, proprietarily, with the other. He is supine and sated and holding a glass of wine.

Paul and Joanne liked to lean for the camera. For their 1968 LIFE cover promoting Rachel, Rachel (she starred, he directed), they are layered on wall-to-wall carpet; she is reclining in the foreground, and he is her blue-eyed backrest. In yet another famous photo from an earlier era (Joanne is still in gingham, not yet in Pucci), they are leaning back to back with their shoulders against each other, their mutual pressure holding each other up, with an isosceles triangle of space between them, and a sturdy baseline of brick patio beneath them.

I like to fall asleep under images of leaning every night and wake up beneath them every day.

I like to believe that leaning is love.

Read more…

An Oral History of Detroit Punk Rock

Negative Approach playing the Freezer, Detroit, early 1982. Photo by Davo Scheich

Steve Miller | Detroit Rock City | DaCapo Press | June 2013 | 39 minutes (7,835 words)

 

Detroit is known for many things: Motown, automobiles, decline and rebirth. This is the story of Detroit’s punk and hardcore music scenes, which thrived in the suffering city center between the late 1970s and mid-’80s. Told by the players themselves, it’s adapted from Steve Miller’s lively, larger oral history Detroit Rock City, which covers everyone from Iggy and the Stooges to the Gories to the White Stripes. Our thanks to Miller and DaCapo for sharing this with the Longreads community.

* * *

Don Was (Was (Not Was) bassist, vocalist; Traitors, vocalist, producer; Rolling Stones, Bob Dylan, Bonnie Raitt, Iggy Pop): So in the seventies I used to read the Village Voice, and I started seeing the ads for CBGB and these bands with the crazy names…and I told Jack [Tann, friend and local music producer] about it: “There must be some way to create something like that here. There must be bands like this here.” I formed a band called the Traitors, and Jack became a punk rock promoter, which wasn’t the way to approach music like that. It was supposed to look cooler than to go in like P. T. Barnum.

Mark Norton (Ramrods, 27 vocalist, journalist, Creem magazine): We were trying to figure out what was next. I called CBGB in ’75 or early ’76; there was a girl who tended bar there named Susan Palermo, she worked there for ages. And she would tell Hilly Kristal: “Hey, there’s this crazy guy from Detroit—he’s calling again.” I’d say, “Could you just put the phone down so I could listen to the groups?” I heard part of a set by the Talking Heads like that. It sounded like it was through a phone, but I was getting all excited, you know—this sounds like what I like. My phone bill was incredible, $200 bucks. In the summer of 1976 I went to New York City. I saw the second Dead Boys show at CBGB. I saw the Dictators. Handsome Dick and his girlfriend at the time, Jodi, said, “Who are you?” I said, “I’m from Detroit.” They said, “Have you ever seen the Stooges?” “Yeah man, I saw them millions of times, the best shows, the ones in Detroit.” I was thinking, “None of these people have seen shit.”

Chris Panackia, aka Cool Chris (sound man at every locale in Detroit): The only people that could stand punk rock music were the gays, and Bookie’s was a drag bar, so they accepted them as “look at them. They’re different.” “They’re expressing themselves.” Bookie’s became the place that you could play. Bookie’s had its clique, and there were a lot of bands that weren’t in that clique. Such as Cinecyde. The Mutants really weren’t. Bookie’s bands were the 27, which is what the Ramrods became. Coldcock, the Sillies, the Algebra Mothers, RUR. Vince Bannon and Scott Campbell had Bookie’s because it was handed to them basically. You know, “Okay, let’s do this punk rock music. We got a place.” To get a straight bar to allow these bands that drew flies to play on a Friday and Saturday night was nearly impossible. What bar owner is going to say, “Oh yeah, you guys can play your originals, wreck the place, and have no people”? Perfect for a bar owner. Loves that, right? There really wasn’t another venue.

Read more…