
Can We Ever Make It Suntory Time Again?

Keith Bishop / Getty, Illustration by Homestead Studio

Aaron Gilbreath | Longreads | October 2019 | 23 minutes (5,939 words)

Bic Camera looked like many of the other loud, brightly colored electronics stores I’d seen in Japan, just bigger. Mostly, it was a respite from the cold. The appliances and electronics that jammed its interior gave no indication of its dizzyingly good liquor selection, nor did the many inexpensive aged Japanese whiskies hint that affordable bottles were about to become a thing of the past, or that I’d nurture a profound remorse once they did. When I found Bic Camera’s wholly unexpected liquor department, I lifted two bottles of high-end Japanese whisky from the shelf, wandered the aisles studying the labels, had a baffling interaction with a clerk, and put the bottles back on the shelf. All I had to do was pay for them. I didn’t.

Commercial Japanese whisky has been around since at least 1929, so during my first trip to Japan (and at home in the U.S.), there was no reason to think that all the aged Japanese whiskies that were readily available in the early 2000s would soon achieve holy grail status. In 2007, there were $100 bottles of Yamazaki 18-year sitting forlornly on a shelf at my local BevMo. One bottle now sells for more than $400 at online auctions; some online stores sell them for $700.

Yoichi 10, Yoichi 12, Hibiki 17 and 21, Taketsuru 12 and 17 — in 2014, rare and discontinued bottles lined store shelves, reasonably priced compared to their current $300 to $600 price tags. Those were great years. I call them BTB — before the boom. Before the boom, a bottle of Yamazaki 12 cost $60. After the boom, a Seattle liquor store priced their last bottle of Yamazaki 12 at $225. Before the boom, Taketsuru 12 cost $20 in Japan and $70 in the States. After the boom, online auctions sell bottles for more than $220.

Before the boom, Karuizawa casks sat, dusty and abandoned, in shuttered distilleries. After the boom, a bottle of Karuizawa 1964 sold for $118,420, the most expensive Japanese whisky ever sold at auction, until a Yamazaki 50 sold for $129,186 the following year, then another went for $343,000 15 months later.

Before the boom, whisky tasted of rich red fruits and cereal grains. After the boom, it tasted of regret.

I’ve spent the past five years wishing I could do things over. I remember my trips to Japan fondly — the new friends, the food and record stores, the Kyoto temples and solitary hikes — except for the whisky, whose absence coats my mouth with the proverbial bitter taste. I replay the time I walked into a grocery store in Tokyo’s Ikebukuro neighborhood and found a shelf lined with Taketsuru 12, four bottles wide and four deep, at $20 apiece; it starts at $170 now. I look at the photos I took of Hibiki 12 for $34, Yoichi 12 for $69, Taketsuru 21 for $89. I tell friends how I’d visited the Isetan Department Store’s liquor department in Shinjuku, where they had a 12-year-old sherried Karuizawa bottled exclusively for Isetan for barely more than $100, alongside a blend of Hanyu and Kawasaki grain whisky that famed distiller Ichiro Akuto did exclusively for the store. Staff wouldn’t let me photograph or touch anything, but I could have afforded both bottles. They now sell for $1,140 and $1,290, respectively. I torture myself by revisiting my unfortunate logic, how I squandered my limited funds: buying inexpensive bottles to drink during the trip, instead of a few big-ticket purchases to take home.

Aaron, I’ve thought more times than I can count, you are such a fucking idiot.

To time travel, I look at photos of old Japanese whisky bottles in Facebook groups, like they are some sort of beverage porn, and wonder: Who am I? What have I become? There’s enough incredible scotch available here at home. Why do I — and the others whose interest spiked prices and made the bottles we loved inaccessible — care so much about Japanese whisky?

The Art of Acceptance Speech Giving

Angela Weiss / Getty, Illustration by Homestead Studio

Michael Musto | Longreads | September 2019 | 9 minutes (2,135 words)

We’ve heard it a million times: “I was nothing until I got this award, and now I’m everything. But this honor isn’t really for me. It’s for you — all the little people out there in the dark, who now have all the inspiration you need to know that someday you can be as great as I am. You just might be holding this trophy someday long into the future — though right now, it’s me! And I love it! Thank you to the Academy, CAA, and God — in that order!!!!”

Inspirational, right? Nope. That’s actually a tone-deaf, self-aggrandizing approach to an awards speech, and we usually end up loathing the winner for being so condescendingly grand about their big moment. It comes off extra phony because we sense that, deep down, the winner isn’t really thrilled with the idea that this honor may lead to millions of other wannabes yapping at their heels and trying to win one.

So what should an award winner say? Well, with the mass audience taking to social networks to dissect every moment of awards shows, speechmaking definitely makes a difference, to the point where a 90-second acceptance can make or break a career almost as much as the award itself can. Anne Hathaway seemed to become significantly less popular because of her breathless laundry lists of names (and by starting her Oscar speech with “It came true”), whereas Meryl Streep has become even more beloved because her speeches are invariably witty, pointed, and also touching. (They should let Meryl win every time, even when she’s not nominated, just so we can hear her talk.)

Meryl knows that an acceptance speech should be sincere yet entertaining, succinct yet somewhat comprehensive, and humble yet confident, and there should also be some real emotion involved. In another seeming contradiction, there needs to be serious thought put into what the winner is saying, but they should also make sure to brim with the spontaneity of the moment. Come on, folks, you’re actors — you can do it.

Glenn Close did brilliantly at the Golden Globes earlier this year, when she was a surprise Best Actress winner for The Wife. Glenn looked shocked when her name was called, yet she quickly composed herself to speak about the themes of the movie and to come off truly grateful and honored. And in framing The Wife as being about a talented woman living in someone else’s shadow, she seemed to herself be crawling out from behind Meryl Streep! It was such a terrific speech that I was sure it clinched Glenn the Oscar, but that instead went to The Favourite’s Olivia Colman, who wasn’t necessarily the favorite, but gave a lovably daffy acceptance that was eccentric and droll.

Alas, instead of speeches like those, we usually get Hathaway-like name checks (“I want to thank my accountant, Jim; my trainer, Joanne…”), speeches that leave out key names (in 2000, when Hilary Swank won her first Oscar, for Boys Don’t Cry, she forgot to thank then-hubby Chad Lowe; they eventually split), phony bouts of gushing, self-satisfied preening, fake-spontaneous recitations (“I didn’t plan anything”) that seem to have been rehearsed for months, and canned orations full of platitudes and advice, as if we schleps out there want nothing more than to someday win Best Lighting in a Musical, and the winner knows just how we can get there.

Bundyville: The Remnant, Chapter Four: The Preacher and the Politician

Illustration by Zoë van Dijk

Leah Sottile | Longreads | July 2019 | 27 minutes (7,641 words)

Part 4 of 5 of Bundyville: The Remnant, season two of Bundyville, a series and podcast from Longreads and OPB

I.

To get to the Kingdom of Heaven, drive a long twisting road that dips in and out of wide green fields dotted with hay bales, skim alongside a crooked river and stop at the sign that says Marble Country. A wooden ranch gate — a tall archway of timber and American flags — marks the spot. Keep going past it for 20 more minutes and you’ll leave the country altogether; drive under that gate, and in a way, you’ll leave America, too.

For nearly 30 years, speculation about what goes on beyond the threshold to Marble Country has confused, scared, and angered folks here in Stevens County — a far-flung region of thick forests and dirt roads, cow pastures and low hills deep in the northeastern corner of Washington state.

Before the first barn wall could be raised on the site of a ghost town, people were already whispering. “Religious Group Says Fear Of Cult Unjustified,” a 1992 Associated Press headline read, “Pentecostal Sect Plans To Move Into Ghost Town.”

That religious group, led by a married couple named Barry and Anne Byrd, intended to create its very own Western-themed shining city on the hill: what they termed a “Christian covenant community.” They called it Marble Country, and they built houses and a church — Marble Community Fellowship — and painted “Holy Ghost Town” on an old barn. They raised families, planted crops. It wasn’t just a new town put down in an old place, but an old place resurrected. A brochure said Marble would get into all levels of politics, offer alternative civil courts and an alternative media.

 


“We are committed to uniting the generations to labor together to bring the dominion of Christ in every area of life,” the Byrds promised in the brochure.

For most of the time Marble Country has existed, the Byrds have hosted an event each summer called the God and Country Celebration. As the Patriot movement has made more and more headlines — between the standoffs at Bundy Ranch in 2014 and Malheur in 2016, and the subsequent trials — the name Marble kept popping up in my reporting. People who’d once been in the movement told me the festival was a gathering of militia bigwigs, Patriot celebrities, and politicians with extreme beliefs. It sounded like some kind of Patriot Woodstock, but it’s closed to the media, so I couldn’t go see it for myself.

In the summer of 2018, Jeanette Finicum was a “special guest” at the festival, bringing with her the message of her murdered, martyred husband. During the weekend, children in cowboy hats and jeans waved big white flags from the Marble stage bearing her husband’s distinct “LV” cattle brand. 

Finicum chose Marble as one of the first places to screen LaVoy: Dead Man Talking, a multipart film about her husband. There she delivered a speech that differed greatly in tone from the one she gave when I saw her speak in Salem, Oregon, just six months later. Someone sent me a recording of her Marble speech: She wasn’t the diminutive chuck-wagon mom I’d seen in Salem, but a pissed-off activist with a message ready for an audience who cheered her on.

“The media is not in the business of telling the truth,” she spat into the microphone. 

The Marble crowd murmured approval — yes, yes, that’s right, amen.

“Their job, their motive, their mission is to create an illusion in order to blur our reality. I was label-lynched by them as a sovereign citizen, anti-government terrorist. Profiled as a domestic right-wing extremist and judged by the American public for standing with my husband,” she said. She told them she was on a watch list. The feds monitored her home.

She never used that word — lynching — when I saw her speak in Salem, but here, both she and Mark Herr, the film’s producer, spoke it as if it were a word created for them. They have been lynched, they told the crowd, again and again. Lynched.

The lynch mob, by their estimation, was the media: inflicting extrajudicial punishment on God-fearing freedom lovers. How dare anyone go after them?

“Your political opponents are using labels and the force of government to lynch you out of existence! What can you do?” Finicum asked. “You can make label-lynching a hate crime.” She told the crowd to lobby state legislators to make Patriots a special class. 

“We should be a protected class,” she yelled. “After all, everyone else is!”

To that, the crowd cheered so loud it was almost hard to hear her anymore. 

***

For decades, Stevens County, where Marble Country is located, has served as something of a wooded, mountainous petri dish for conspiracy theories to grow, flourish, and find new hosts. For most of that time, one daily newspaper reporter was there to document the crimes committed by the fringe groups who’ve found haven in Stevens County’s sparsely populated areas. His name is Bill Morlin, and for decades he worked at the Spokane Daily Chronicle, then The Spokesman-Review. I first met Morlin, now in his 70s, in the federal courtroom during the Bundys’ short-lived trial in Las Vegas. 

In the spring of 2019, I called him up to get a crash course on Stevens County’s right-wing extremist history. Something that may come as a surprise to people who aren’t familiar with the Inland Northwest is that the Northwestern United States isn’t all rain showers and mountains and Nirvana records, coffee shops and weed stores on every corner. 

In fact, Eastern Washington and North Idaho couldn’t be less in line with that image. It’s a deeply conservative area of the West. It’s hot and dry in the summer, cold as hell in the winter. In the past few years, some people have started to call this region the American Redoubt — the nickname survivalists and preppers have given Eastern Washington, Idaho, Montana, and Wyoming, arguing that it’s a safe haven for libertarians. The term was popularized by James Wesley Rawles, who calls the people who migrated there for that reason “the remnant.” Libertarians and preppers from around the country have been encouraged to make a home here. There are even “redoubt realtors” who’ll sell you a house, complete with a bomb shelter.

I came to talk to Morlin about Stevens County, but also about this region as a whole. He came prepared for our meeting with three pages, single-spaced, detailing various murders, robberies, kidnappings, and bombings committed by people from the county.

You can’t talk about the violent history of Stevens County without first understanding the Aryan Nations, a neo-Nazi group that had a compound in nearby North Idaho — two hours from Stevens County. It was one of the first violent groups in the Pacific Northwest that Morlin recalls writing about. He tells me about a 1983 cross-burning ceremony at the Aryan Nations compound that he covered.

In the late 1970s, Richard Butler, who would become one of the most famous white supremacists in the country, had set up the swastika-emblazoned compound near Hayden Lake, Idaho, attracting racists from every corner of the country to the Idaho Panhandle. Butler allowed Morlin and a photographer to document the event, which the newspaper had been trying to cover, as a way of attempting to understand who, exactly, was gathering at the compound. 

“There was sort of a division, like do we pay these people any attention or do we ignore them?” he recalled of his paper’s coverage of cross burnings. “In fact a columnist at the other newspaper thought we were foolish for writing about the fact that there’d been a cross burning. He was of the school of thought that if you ignore them, they’ll go away, and by writing about them all you’re doing is giving them publicity. 

“I have never to this day signed on to that belief system,” Morlin continued. “Neither do major civil rights organizations. They believe that turning the lights on is the only way you can deal with hate groups.”

The cross burning was called the Blessing of the Weapons and was presided over by former Michigan KKK grand dragon Robert Miles. (In 1973, Miles was convicted of conspiring to bomb ten school buses in Pontiac, Michigan.) 

“It was very uncomfortable,” Morlin said. As the group of 40 to 50 people lit three crosses wrapped in diesel-soaked burlap, “each person in the circle would walk up with his weapon … knives or handguns or long rifles. And each of them would be blessed by the master of ceremonies. The ceremony was to signify that these people were committing to the white cause and the fight for the white race that they envisioned was coming any day.”

That night, Morlin didn’t know who exactly all those men were that had their guns blessed in the name of a white war — but soon, he would. They would become known as the Order. It was an all-white underground domestic terrorist organization established by an anti-government extremist and racist named Bob Mathews, who had been actively recruiting people to create a “White American Bastion” in the Pacific Northwest and was motivated, in part, by an extremist ideology called Christian Identity. 

It’s an ideology that relies on the belief that Jews are descendants of Cain, and people of color are soulless and “beasts of the field,” while whites are the true “House of Israel.” Some Identity adherents believe Jews are the spawn of Eve and Satan. Butler, too, preached Christian Identity from his very own church at the compound. Around the nation, neo-Nazi groups and the Ku Klux Klan also believed in the radical ideology. 

Nationwide, as violent white supremacist fires flared, Christian Identity — time and time again — was the pitch wood making it burn hot and constant.

The men of the Order met at a cabin on Mathews’s Northeastern Washington property, which was located in the county next to Stevens County. They “stood in a circle secretly and pledged a blood oath to each other to jointly fight this race war that they believed was coming,” Morlin told me. 

Morlin believes the men were inspired by a work of racist, apocalyptic fiction, a novel called The Turner Diaries that details a race war, and that, later, compelled Timothy McVeigh to bomb the Alfred P. Murrah Federal Building in Oklahoma City.  

According to Morlin, the men at the ceremony eventually committed “a litany” of violent acts, most notably the 1984 assassination of a Jewish radio host named Alan Berg, who’d mocked a tenet of Christian Identity — that Jews were evil incarnate — on his Denver talk show. They committed a robbery in Spokane, bombed a synagogue in Boise, and robbed armored cars in Seattle. But investigators were baffled, unable to figure out who was responsible for so much violence. 

“This is in an era before the term ‘terrorist’ meant anything to anybody. I mean it’s like ‘Domestic terrorism? What’s that?’” Morlin said.

During a Northern California robbery of several million dollars from an armored car, Mathews left a handgun behind — a mistake that would eventually lead to the downfall of the Order. Mathews died in a shoot-out before the group’s 1985 trial in Seattle, which Morlin covered for The Spokesman-Review.

“A lot of the East Coast networks and newspapers had pretty much ignored the fact that the Order trial had occurred,” he says. “It was really a big deal, but it had happened on the West Coast and it didn’t get the news coverage, in my view, that it would have received if it had been in Florida or New York or Ohio or Pennsylvania.”

In fact, the Order created a new legacy for up-and-coming racists to follow: Today, violent white supremacist groups still cite an adherence to a mission statement called “The 14 Words” — “We must secure the existence of our people and a future for white children” — which was coined by one of the Order’s members. 

The men of the Order weren’t exactly quiet about the ideas that drove them: Mathews and other members of the group were known to convene at a Colorado Christian Identity church led by an anti-Jewish, anti-homosexual, and racist preacher named Pete Peters. Despite its small population, by the 1990s, Stevens County was home to at least two Christian Identity churches: the Ark, near the Canadian border, and another founded by a former Ark acolyte, the Christian Israel Covenant Church. (The Ark is now called Our Place Fellowship; the Christian Israel Covenant Church disbanded in the early 2000s.)

“Those churches taught that white people are the superior race, that Jews are biologically satanic,” Morlin told me. 

The churches were small — and though the pastor at the Ark, Dan Henry, told The Spokesman-Review in 1992 that he rejected the “hate mongering” of the Aryan Nations, he also acknowledged preaching antisemitic ideas. 

But word had gotten around. People knew who was attending services. So it was common knowledge that the couple trying to start that new Christian covenant community called Marble Country — Barry and Anne Byrd — had attended the Ark for years. 

It was like the county knew what was about to happen — that this tiny bastion of hateful ideas was about to cross the Rubicon, producing a number of followers who would spill blood in the name of Identity ideology all around the American West.

***

The racist services at the Ark were attended not only by adults who wanted to hear the sermons of Henry and other extremists, but often by the children of those people, too. Chevie Kehoe fit the profile of one of those kids. He was raised in part in Stevens County, and his parents, Kirby and Gloria Kehoe, brought their children to services at the Ark, likely around the same time the Byrds attended. As his children grew older, Kirby Kehoe, an adamant racist, grew increasingly skeptical of the government, pulling his kids out of their Colville, Washington, public school, viewing schools “as a threat,” according to his son. In a 1999 New York Times interview, Chevie said his parents were interested in the notion of a whites-only region preached by the Order’s Mathews, and over time Chevie came to believe that he himself could bring the plan to fruition in the Northwest. He called the region the Aryan People’s Republic, and began committing robberies and acts of violence in devotion to the concept. 

In the late 1990s, he launched a cross-country trip to recruit people to his white region — a trip that turned into a spree of murders, shootings, and robberies.

In 1996, Chevie Kehoe robbed and murdered a man, his wife, and her 8-year-old daughter in Arkansas, then tossed their bodies into the Illinois Bayou. The next year, police officers in Ohio pulled over Kehoe and his brother, Cheyne; in two subsequent shoot-outs, Kehoe fired 33 bullets, seriously injuring a pedestrian before fleeing. Both were arrested after a brief manhunt, and Chevie was later sentenced to three consecutive life sentences without the possibility of parole. 

Even decades after Chevie Kehoe’s imprisonment, the whites-only nation idea that invigorated him, Mathews, and the Order before him, would keep surfacing in new ways and in new forms.

Kehoe is now incarcerated at the ADX Florence supermax prison in Fremont County, Colorado, alongside McVeigh’s Oklahoma City bombing accomplice Terry Nichols and 1996 Olympic Park bomber Eric Rudolph, who was inspired by Christian Identity to bomb abortion clinics, a lesbian bar, and the 1996 Olympics in Atlanta.

In 2012, serial killer Israel Keyes, who grew up with the Kehoe brothers and who also occasionally attended the Ark as a child, confessed to committing robberies and murders from coast to coast before reportedly dying by suicide in a jail cell. It’s unclear if his crimes were inspired by any sort of ideology, but during the 1990s, his father wrote a letter of support for both the Byrds and Pete Peters that was published in the local paper.

Keyes wrote that it wasn’t illegal to practice Christian Identity: “It is my understanding that the Marble Community Fellowship has very little to do with the Christian Identity Movement, but so what? Haven’t we as Americans a right to exercise a belief in God and celebrate our white heritage and Christian religion? After all, many Jews consider their race to be God’s chosen people. Is this not racism at its zenith?”

Morlin told me that he reported from a meeting of the Stevens County Assembly — an anti-government militia — in 2012, in which neo-Confederate Pastor John Weaver spoke. Weaver gives racist sermons from the pulpit — sometimes in front of a Confederate flag, sometimes wearing a Confederate flag–printed tie — railing against interracial marriage, and advocating for slavery. By the time of the meeting, he was no stranger to Eastern Washington. In the early 1990s, he appeared at a Spokane conference of white supremacists, during which he promoted his book that urged Americans to break laws should the government become occupied by Jews.

In 2015, Weaver was back in Stevens County to give another speech — this time, he was onstage at Marble Country. 

 

II.

Marble’s God and Country Festival wouldn’t be what it is without a speech from a Washington State House Representative from a district two hours away. 

His name is Matt Shea. A clean-cut Army veteran with a law degree, Shea wears thin glasses, dresses in crisply ironed shirts, and smiles tightly. He positions himself as a voice of rural people, but actually represents a district that includes Spokane Valley, a largely suburban city of almost 100,000. 

Rep. Matt Shea at a January 2017 gun-rights rally in Olympia, Washington. (AP Photo/Ted S. Warren, File)

Shea, over the course of six two-year terms, has become a fixture at the far-right edge of what Washingtonians consider Republican. He rarely speaks to reporters — unless they work for publications that have the words “liberty” or “redoubt” in their name. I know more people who’ve done in-person interviews with President Trump than with State Representative Shea, and for years, I worked at newspapers that covered his district. 

For Shea’s constituents to get an understanding of his ideas, they need to tune in to his podcast. The show always takes the same format: Shea reads off some headlines from right-wing news sites, then interviews a guest, while often piping up in agreement with their outlandish theories. 

Those guests tend to hold views reflected in the bills Shea introduces in the Washington House. They’re unflinching Second Amendment advocates. This spring, a woman on the program preached abstinence-only sex education and an anti-vaccine “researcher” claimed that child immunizations are contaminated with aborted fetuses. 

Mostly, they’re conspiracy theorists and bigots with views Shea parrots. This spring, the legislator hosted a representative from an anti-abortion and homophobic group that has participated in burnings of the Quran. He interviewed a man who spouted talking points from conspiracists who believe in Agenda 21 — a theory that sustainable development is a shady plan hatched by a “New International Economic Order” to control people and take their freedom. Recently, he hosted a conspiracy theorist who believes the 9/11 World Trade Center attacks were actually a “controlled demolition.” 

You could say Shea is a lot like Bill Keebler — except he wears a suit and taxpayers pay him a salary. 

Shea, for years, has seemed at home among the creators of fake news and conspiracy theories that turn violent. As early as 2009, he made several appearances on conspiracy king Alex Jones’s InfoWars show, where Jones introduced him with reverence. “Representative,” he says, “good to have you on with us.” In that February 2009 interview, Shea and Jones spoke of their belief that the federal government was setting up camps to imprison Americans. 

It seems as though in Shea’s world, the country is on the verge of collapse. People will have to fight for their lives. And he intends to be prepared: “If you do not have 5,000 rounds of .223, 5,000 rounds of .22 and a thousand rounds of handgun ammo as a minimum, you’re wrong!” he called from an Idaho stage in 2013. 

“We want to prepare for the inevitable collapse that’s gonna happen. And yes, I said that as a politician here onstage. It’s gonna happen! We all know that! The question is, and I think the question should be for all of us, what are we gonna do afterwards? What are we gonna do with that opportunity?”

Apocalypse, government collapse, anarchy — in his world, these are exciting prospects. Opportunities even. A chance at a fresh start, a time to get society back on track. 

In this fantasy apocalypse, perhaps being well-prepared and well-armed will be so necessary that the person you were in the past — in the pre-collapse — won’t matter. Money will be obsolete. Laws won’t be enforced. Maybe a violent past will suddenly be seen as an asset. 

This might have special appeal for Shea. His ex-wife, who filed for divorce in 2007, alleged that Shea grabbed her so hard during two arguments that he left bruises on her arms. In those same divorce filings, she told stories of a controlling man; by her account, he commanded her to always walk on his left side because a soldier needs to be able to draw his sword from the right. (Shea was in the Army and served in combat, but his wife said he did not traditionally carry a sword.)

Shea did not respond to requests for comment, but when asked a decade ago about his divorce by The Spokesman-Review, he denied any violence and said, “I love my wife and, when I married, I intended it to be for life. Unfortunately, my former wife didn’t and decided to pursue her third divorce.”

In 2011, Matt Shea was involved in a road rage incident in Spokane, in which another driver alleged Shea pulled a gun. In a police report, Shea told officers that as an Iraq war veteran he had to use “evasive techniques” to avoid hitting the man’s car (which Shea described as engaging in “Baghdad driving”), and proceeded to follow it. Shea admitted to officers that he had a gun in his car, that he produced it from a glovebox during the incident, and that he had an expired concealed carry permit. The other driver said he saw the handgun and was afraid Shea was going to shoot him. Later, Shea’s attorney made a deal with prosecutors that resulted in the charges being dropped.

Even now, in a time he surmises is the end of civil society, all of this has become standard Shea stuff. None of it did real damage to his standing with voters. But that didn’t mean the things he said didn’t set people on edge. 

In the spring of 2014, a woman was eating at a Spokane Valley Mexican restaurant when she overheard a conversation between two men at the next table over. Later, she found out those men were Shea and the head of the Oath Keepers militia, Stewart Rhodes. 

But sitting there, hearing them, she became so concerned over what they were saying that she took their picture and called the police. According to a police dispatch, the woman overheard “a conversation from a group of males talking about snipers, Clive [sic] Bundy, and public militias.” One of the individuals, she told the police, had “thermal imaging binoculars,” and the group sounded “like they were planning something.”

Still, Shea won the election that year with 57 percent of the vote. 

If he could sit in a diner with one of the biggest militia leaders in this country and openly talk about military tactics, it seemed like Shea could be as extreme as he wanted — and it wouldn’t cost him any support. And even some of the most conservative Republicans in Eastern Washington were baffled by how Shea stayed in office. 

Two of those people are Sheriff Ozzie Knezovich and a former Republican state legislator from Stevens County, John Smith. In a three-part podcast on white supremacy in the region, the pair suggested that Shea’s involvement at Marble Country was something voters should worry about. It was a part of a deep history of racism and hate that had found a home in this region going way back.

Smith was raised by his grandparents in southern Idaho — and his grandfather was friends with people in the Aryan Nations and in the Order. Their home often had new people coming through the door. He remembered his grandfather laying maps out in the kitchen nook and drawing up plans for “an armed revolt.” 

Smith realized on his own the ideology he’d been raised around was rotten and that he had to find a way out of it. He took a job as a ranch hand when he was 16 years old, and as a young adult, he attended church at the Ark. He was later married there, though he says he and his wife have since cut their connections with the church. 

But he told me that it’s become something of a mission for him to speak up when he sees ideas rooted in Christian Identity catching on here. Stevens County has a history — he knows it, everyone does, even though racists have always been a fringe minority. And in a podcast with Knezovich, Smith hoped people would hear stories of his childhood as a cautionary tale. 

“I grew up in that environment, and that stuff doesn’t wash off you. I acknowledge that darkness might still be inside me,” he told me. He maintains that he’s constantly trying to make sure he’s free of it, to root out any part of him that might still carry what he learned as a kid — asking friends who aren’t white, who weren’t raised around neo-Nazis, if he’s changed.

“I actively go to them and say, ‘Look at me and tell me, is it still in me? Am I still saying the wrong things? Am I still thinking of this in the wrong way?’ I’m trying to not have that be in there anymore. And maybe part of that is standing up and saying this is not OK.” 

In the video versions of the podcast, Smith looked diminutive next to Knezovich, a tall, hulking man with a bald head and a sidearm, who shook my hand firmly and didn’t smile once when I interviewed him in a conference room at the Spokane County Sheriff’s office last summer.

He told me he sees Shea’s increasingly conspiratorial rhetoric and the allegations of aggressive behavior against him through a lens of one reality his department deals with regularly: that racism is alive and well in his county. He talked about getting a call one morning that KKK flyers had appeared plastered all over a suburb called Millwood, and about teenagers spouting white nationalist talking points in the hallways of local high schools. 

He also talked about threats. Since Knezovich — a member of the local Republican party and a man who twice endorsed Shea — started speaking up about Shea, he has received death threats from people associated with the legislator. 

“I’ve got my estate in order. I’ve got my will done. The kids have all been briefed. And don’t take this as me being flippant. Nobody wants to die. I came to grips with death a long, long time ago,” he says. “And there’s been more people than I that have died for this country. And if that’s what it takes for people to wake up to what’s happening around them. All right. I love my nation. And if it takes fighting these people on these terms? Bring it on.”

***

In 2015, Shea was at the God and Country Celebration again, this time next to John Weaver — the neo-Confederate preacher. The next year, many of the legislators from around the West who sympathized with the Bundys in both 2014 and 2016 showed up to Marble, too. 

In some years, Anne Byrd posted photos to Facebook of the people who came to Marble. In the caption of a picture of Val Stevens, a former Washington state rep, Byrd wrote that Marble was “blessed” for legislators to be “standing in the gap” for the people.

By the summer of 2018, in the months before the election when many legislators campaign in their districts, Matt Shea appeared alongside Jeanette Finicum at the God and Country Festival. He talked about an idea he’d been shopping around for years in the Washington statehouse: He wanted Eastern Washington to secede and become “a safe haven,” a 51st state called Liberty.

Shea insisted people east of the Cascades just didn’t agree with the values of “downtown Seattle,” so why even try to get along? “I would submit, here in Eastern Washington, we believe in the right of self defense. We also believe the constitution means what it says,” he told another crowd. Seattle doesn’t because, he says, it is filled with communists. “And communism, real communism, has killed more people as an ideology than any other ideology in the history of the world — atheist communism.”

For all the time Shea spent up here in Stevens County, far from his district, he wasn’t recruiting any new voters. But he did appear to be amassing a following for a political movement, one in which he was a leader and visionary.

I wanted to ask him about that, but last summer he didn’t respond to my email requests for an interview. Shea has no press liaison, but he does have a personal security detail — atypical for a state rep — known to include a man who lives at Marble and who once tried to bring an AK-47 onto the grounds of the Spokane federal courthouse.

So I figured if I really wanted to ask him a question, and get any kind of an answer, I should show up to a gun rally where he was slated to be a featured speaker.

It was a hot August day — a dry heat, as people in Eastern Washington like to say. The rally was to be held at a large, grassy park on the north side of Spokane — much closer to his district than Stevens County, but still not in it — a place where people play softball and lay out picnics. On this day, a small crowd gathered. For the most part, they wore shirts emblazoned with proclamations of love for guns and freedom, but several wore militia gear and carried militia flags. Several carried AR-15s.

I listened to Shea give a speech, one that would go on to make headlines around the West, in which he called journalists “dirty, godless, hateful people.” The small crowd — which included leaders and members of the 63rd Lightfoot militia and a local politician who once stomped on the United Nations flag in front of Spokane City Hall — loved it. They cheered Shea on as he yelled, wide-eyed, pumping his fists.

When he was finished, I trudged across the grass, introduced myself, and said I was hoping to ask him some questions: about this 51st State idea and his affinity for speaking at Marble each year. To my surprise, he agreed to talk. 

Read more…

Live Through This: Courtney Love at 55

Mick Hudson / Getty, istock / Getty Images Plus, Michael Ochs Archive / Getty, Vinnie Zuffante / Getty, pidjoe / Getty, Illustration by Homestead

Lisa Whittington-Hill | Longreads | July 9th, 2019 | 24 minutes (6,539 words)

It’s hard to tell whether Thurston Moore is being sarcastic or sincere. It’s probably a bit of both. “The biggest star in this room is Courtney Love,” says the Sonic Youth singer and guitarist in a scene from 1991: The Year Punk Broke. The documentary follows Sonic Youth’s summer 1991 European tour and features performances and backstage antics from their tourmates, including a pre-Nevermind Nirvana, Babes in Toyland, and Dinosaur Jr.

Moore makes the comment during an interview with 120 Minutes, an MTV program that spotlighted alternative music in the days before the music channel became the home of teen moms and spoiled Laguna Beach brats. As Moore declares his love of English food to the host — most definitely sarcasm — Love is behind him trying to get the camera’s attention. She waves and appears to stand on something to make herself taller. Her efforts pay off and soon she is in front of the host, all brazen, blond, and sporting blue baby doll barrettes.

Tongue-in-cheek or not, Moore was right. Love’s band Hole wasn’t on the European tour bill that summer and their debut album Pretty on the Inside hadn’t even been released yet, but Love was already on MTV.

Read more…

It’s Like That: The Makings of a Hip-Hop Writer

T-Neck Records, 4th & B'way, Jive, Profile Records, Ruffhouse Records

Michael A. Gonzales | Longreads | June 2019 | 45 minutes (7,644 words)

 

Recently a friend told me, “When I was a newbie at Vibe magazine, I always thought, Mike looks like what I always imagined a real writer looked like, with your trenchcoat and briefcase and papers … and your hats. I can’t forget the hats.” Though he did forget the Mikli glasses and wingtips, I had to confess my style was one I’d visualized years before when I was a Harlem boy hanging out in the Hamilton Grange Library on 145th Street, looking at Richard Wright, Chester Himes, and James Baldwin book jacket pictures.

Read more…

True Roots

Daniel Berehulak/Getty Images

Ronnie Citron-Fink | True Roots | Island Press | June 2019 | 34 minutes (5,655 words)

 

How’d you do it? Are you doing that on purpose? Are you okay? Ever since I stopped coloring my silver hair, I’ve gotten a lot of questions. One of the most common during my hair transition was Why are you letting it go gray? While my roots didn’t ask permission before they stopped growing in dark brown, it was a complex mix of fear and determination that rearranged my beauty priorities. The question of why — why, after twenty-five years of using chemical dyes, I gave them up — is something I’ve thought about a lot.

My world began to shift four years ago. I was sitting in a meeting about toxics reform in Washington, DC, when an environmental scientist began to describe the buildup of chemicals in our bodies. As she rattled off a list of ingredients in personal care products — toluene, benzophenone, stearates, triclosan — my scalp started to tingle. “We’re just beginning to understand how these chemicals compromise long-term health,” she concluded.

Read more…

And What of My Wrath?

Illustration by Zoë van Dijk

Sara Fredman | Longreads | May 2019 | 9 minutes (2,555 words)

 

What makes an antihero show work? In this Longreads series, It’s Not Easy Being Mean, Sara Fredman explores the fine-tuning that goes into writing a bad guy we can root for, and asks whether the same rules apply to women.

I didn’t want to write about Game of Thrones. Truly, I didn’t. In the first place, it is an ensemble show and therefore not technically an antihero vehicle. It is also generally the realm of the hot take and this series is usually a place for tepid, if not downright frigid, takes. It is Winterfell, not Dorne. But here we are in Dorne, talking about Game of Thrones, though probably a week or so after it would have been maximally festive. So maybe it’s more accurate to say that we’re in King’s Landing, which is perfect because we’re here to talk about how, on any other show, Cersei Lannister could have been the female antihero we’ve all been waiting for.

Cersei is the closest female analogue to the Golden Age antiheroes who turned the genre into a phenomenon. Those men — Tony Soprano, Don Draper, Walter White — all do terrible things for a host of reasons: because they want to, because power feels good, because they’re doing what they need to do to survive in the world. Despite the fact that these men do terrible things, we root for them because of a careful calibration of their characters and the environment in which they operate. They are marked as special, or especially skilled; they are humanized by their difficult pasts and their dedication to their children; and, finally, they are surrounded by other, more terrible people. Cersei has, at one point or another in the show’s eight-season run, fallen into all of these categories. She is smart and cunning. I recently rewatched a scene I had forgotten, from early in the first season, in which she pokes holes in the plan her dumb and petulant son Joffrey comes up with to gain control of the North. The scene shows us that she understands the stakes of the titular game and how to play it successfully: “A good king knows when to save his strength and when to destroy his enemies.” The audience knows that Joffrey can never be that king and, despite Cersei’s keen grasp of her political landscape, neither can she. She may be depicted as a villain throughout most of the series but she is also clearly a talent born into the wrong body, and she knows it. As she says to King Robert Baratheon: “I should wear the armor and you the gown.”

This brings us to our next antihero criterion, which is the humanizing influence of interiority and family. It is axiomatic among the show’s characters and creators that Cersei’s most humanizing characteristic is the love and dedication she shows her children. In their final scene together, her brother Tyrion begs her to surrender with the only card he believes will matter: “You’ve always loved your children more than yourself. More than Jaime. More than anything. I beg you if not for yourself then for your child. Your reign is over, but that doesn’t mean your life has to end. It doesn’t mean your baby has to die.” In showrunner David Benioff’s view, Cersei’s children were the only thing that could humanize her: “I think the idea of Cersei without her children is a pretty terrifying prospect because it was the one thing that really humanized her, you know — her love for her kids. As much of a monster as she could sometimes be, she was a mother who truly did love her children.”

But the thing about an antihero show is that it can turn any monster into a hero.

It is of course true that Cersei loves her children, but it is hard to square Tyrion’s description of his sister with the Cersei of season two’s “Blackwater” who was prepared to kill herself and Tommen, her youngest son, rather than be taken alive by Stannis Baratheon and his army. Tyrion thinks that Cersei loves her children like a June Cleaver when she actually loves them like a Walter White. For the antihero, love of family is about self-advancement, not self-sacrifice. Invoking his children will not dissuade him from doing bad things because their existence is the very thing that motivates him to do them. This is why Walter White can yell “WE’RE A FAMILY” right before he takes his infant daughter away from her mother.

David Benioff’s assertion that Cersei’s love of her children is the only thing that humanizes her is possibly the best example of the way in which the Game of Thrones writers misunderstood their characters and their audience. It overlooks the other reasons the show gave us to root for Cersei and betrays an ignorance of the extent to which enduring patriarchy might itself be, for at least a portion of its audience, humanizing. It reveals an inability to grasp the possibility that the mother and the monster can be the same person. For a show dedicated to demonstrating just how thin the line is between good and evil, Game of Thrones was surprisingly blind to Cersei’s potential to become a compelling antihero, to be humanized by something other than her children. Or maybe the show realized it all too well.


Seasons five and six in particular could have been a — forgive me — game changer for the audience’s relationship with Cersei. Their storyline has Cersei first trying to manipulate and then fighting off a band of homophobic and misogynist religious ascetics called the Sparrows. Initially, the audience appreciates the way the High Sparrow thwarts Cersei’s attempts to use religion to strengthen her own political position. She’s been a villain for four seasons and we relish seeing her hit a roadblock. But the High Sparrow and his sidekick Septa Unella take it too far and our allegiances begin to shift. Septa Unella tortures Cersei in prison and the High Sparrow declares that Cersei must take a walk of penance through the streets of King’s Landing. Her hair is shorn and she walks naked from the Sept of Baelor to the Red Keep as Septa Unella chants “shame” and rings a bell to draw onlookers. In that sequence, we don’t forget that Cersei’s done terrible things, but we feel sympathy for her because she is, in that moment, at the mercy of other, more sinister forces. We also feel sympathy for her because this showdown with the High Sparrow reminds us that her story is that of a woman living under patriarchy, that her autonomy has always been contingent and therefore largely an illusion. We remember that this is not the first time Cersei has been powerless, that in the first season we saw her husband hit her and then tell her to wear her bruise in silence or he would hit her again. We remember the way her father, Tywin Lannister, spoke to her (“Do you think you’ll be the first person dragged into the Sept to be married against her will?”), and we also remember that she was raped by the one man she loved next to the body of her murdered son.

In most of the ways that matter, Cersei’s relationship with Sansa Stark, betrothed to marry Cersei’s abusive son Joffrey, is evidence of her villainy but it is also a frank education in what becoming a wife and mother means under patriarchy. Looking back on some of their scenes together, one gets the sense that Cersei feels compelled to explain to Sansa what she’s in for, to disabuse her of any notions of happily ever after and replace them with the reality of life as a political pawn, a prisoner in expensive dresses. We see this as coldhearted and evil because we hold out hope that Sansa will be able to remain an innocent princess looking for true love, but that’s not an option for girls like her, and Cersei knows it. In a heart-to-heart after Sansa gets her period for the first time, Cersei assures her that while she will never love the king, she will love her children. Sansa has just become a woman, which makes her eligible to be a wife and mother. Cersei knows that this is an occasion for a political lesson rather than a domestic one: “Permit me to share some womanly wisdom with you on this very special day. The more people you love, the weaker you are. You do things for them that you know you shouldn’t do, you’ll act the fool to make them happy, to keep them safe. Love no one but your children. On that front, a mother has no choice.” When we hear it from her own mouth, Cersei’s love for her children sounds less like deliberate self-sacrifice than yet another matter in which she has no choice.

Tyrion thinks that Cersei loves her children like a June Cleaver when she actually loves them like a Walter White.

It’s probably worthwhile to remember that the “game” we have spent eight years watching is only being played in the first place because Robert Baratheon assumed that a woman who left him had to have been taken (“I only know she was the one thing I ever wanted and someone took her away from me”). Women are things to be taken and traded; they are the tools men use to cement alliances and consolidate power. Freedom of movement and freedom of self-determination are precious commodities to which only some people in Westeros have access, either by birth or cunning. None of those people are women. Cersei is hardly the only victim of patriarchy on the show, but she could have been its most symbolic. More than anything, Cersei wants to control her own body and her own destiny. She wants to be a player, rather than a pawn. When Ned Stark confronts her about her relationship with Jaime and the illegitimacy of their children, he warns, “Wherever you go, Robert’s wrath will follow you.” Cersei replies, “And what of my wrath, Lord Stark?” This question is, of course, rhetorical — everyone knows that a woman’s anger only earns 78 cents on the dollar. We side with Ned, but on another show, Cersei’s question could have been a rallying cry. We might have written it on signs taken to #resistance rallies and anti-abortion protests. Neither Cersei nor Robert has been faithful, but Robert’s anger matters more because he is the king and Cersei’s infidelity matters more because her body is for making him a bloodline.

The Sept of Baelor pyrotechnics in the season six finale could have easily been Cersei’s “Face Off” moment: a shocking triumph over her enemies showcasing her intelligence and tactical skill. The move was not only brilliantly efficient, killing off everyone who opposed her at once without leaving home, but also bursting with symbolism. She destroys the religious cult that stripped her of what little bodily and political autonomy she had and blows up the place where she married Robert and was raped by Jaime. Cersei watches from her window as the architectural incarnation of patriarchy goes up in green flames and then takes a sip of wine.

That masterfully shot, suspenseful sequence is immediately followed by Cersei’s vengeful speech to her torturer, Septa Unella, before leaving her in the hands of Gregor Clegane:

“Confess, it felt good, beating me, starving me, frightening me, humiliating me. You didn’t do it because you cared about my atonement, you did it because it felt good. I understand. I do things because they feel good. I drink because it feels good. I killed my husband because it felt good to be rid of him. I fucked my brother, because it feels good to feel him inside of me. I lie about fucking my brother, because it feels good to keep our son safe from hateful hypocrites. I killed your High Sparrow, and all his little sparrows, all his septons and all his septas, all his filthy soldiers because it felt good to watch them burn. It felt good to imagine their shock and their pain. No thought has ever given me greater joy. Even confessing feels good under the right circumstances.”

Cersei is hardly the only victim of patriarchy on the show, but she could have been its most symbolic.

This is Cersei’s “I am the one who knocks” speech, the moment where the antihero lays bare her unsavory machinations, and we applaud because a formerly weak person now has some hard-won power. Walter White takes some time to understand that if he is to have any power, he must take it. Cersei has always understood that power is her only available means toward self-determination, a ballast against the whims and wishes of those who would try to use her to further their own storylines and try to capture a bigger piece of the Westeros pie. Power is, for her, a necessity rather than a perk. Thinking about Cersei as an antihero, however brief the time we spend cheering her on, makes clear the extent to which writing a successful antihero always involves portraying that character as but a small player in a much bigger game. This is Walter White up against Big Pharma, which cut him out of profits to which he feels entitled and is now forcing him to forfeit his family’s financial security to stay alive. It is Tony Soprano chafing against RICO and the possibility that anyone in his orbit could help the FBI lock him up. It is Don Draper trying to hold on to a life he was never supposed to have. And it is Philip and Elizabeth Jennings doing the job they were trained to do, while people we never see change the rules and determine its stakes. An antihero isn’t on top of the world but right there in the melee, jockeying for some small measure of self-determination. We realize, as they do, that no matter how much power or control they seem to have, they are only one step away from being literally or metaphorically paraded through the streets naked while someone rings a bell.

Cersei is the closest we’ve come to a female version of this kind of character. David Benioff is right: Cersei is a monster. But the thing about an antihero show is that it can turn any monster into a hero. It compels us to root for a monster by making us see the monstrosity lurking all around him and, in so doing, turns him into our monster. Monstrosity in Westeros is like wildfire under King’s Landing: There is more than enough of it to make Cersei a queen we root for while she sips her celebratory wine. Allowing Cersei to become a full-on antihero could have been incredible, giving the show an opportunity to explore the particular powerlessness of women under patriarchy. What difference does motherhood make? What particular vulnerabilities does it bestow, what kinds of unexpected powers or motivations? But this is the fantasy world we have, not the one we need, and Game of Thrones could never allow Cersei to fully become the antihero character they had temporarily conjured. Three weeks ago — on Mother’s Day no less — we saw her crushed by a building, dying in the arms of her rapist after begging him not to let her die. As bad as Game of Thrones was at writing women, it gave us one possible roadmap for creating a female antihero on par with the bad men we’ve seen win Emmys over the past two decades. But it also makes clear just how tough that road is to travel because it requires that we expand our idea of what kinds of people are allowed to do bad things in pursuit of their own self-determination, to become the one who knocks.

Next, we’ll dive into half-hour television for our first solo female antihero — single mom Sam Fox of Better Things — because there’s no audience more adept at pointing out a woman’s flaws than her children.

* * *

Previous installments in this series:
The Blaming of the Shrew
The Good Bad Wives of Ozark and House of Cards
Mother/Russia

* * *

Sara Fredman is a writer and editor living in St. Louis. Her work has been featured in Longreads, The Rumpus, Tablet, and Lilith.

Editor: Cheri Lucas Rowlands
Illustrator: Zoë van Dijk

The Joy of Watching (and Rewatching) Movies So Bad They’re Good

Wiseau-Films, Warner Bros, American International Pictures, Quintet Productions, Four Leaf Productions, Mid-America Pictures

Michael Musto | Longreads | Month 2019 | 8 minutes (2,090 words)

 

I’ve known about the power of good/bad movies since I was a kid, but I was reminded of it just a few days after 9/11, when I went to a press screening of Mariah Carey’s unwitting classic Glitter.

Naturally, New York City was traumatized, many of us going through the motions in a daze as we tried to make sense of the horror. But we had to make a living, so, along with a handful of other arts journalists, I dragged myself to the screening, not sure of what we were getting into. It turned out to be the hackneyed story of a DJ who tries to lift a backup singer (Mariah) up from her humble roots through song and romance. And it quickly became evident that Mariah just didn’t have the acting chops; the new Meryl Streep this wasn’t. We sat there uncomfortably watching the pop diva try to act, but eventually we couldn’t hold back, and a few of her line readings were greeted with titters — the first time I’d heard laughter (including my own) since 9/11. It sounded both shocking and very welcome, and the unintended reaction mounted during a ludicrous scene where Mariah and the DJ were magically thinking of the same melody. By the end, when Mariah spills out of a limo in a glittery gown to visit her dirt-poor mother, we were all screaming in hilarity. This was just the catharsis we needed, and it generously helped us bond and move on.

Read more…

‘There’s Virtually No Conversation In Chicago … About the Aftershocks of the Violence.’

Residents, activists, and friends and family members of victims of gun violence march down Michigan Avenue carrying nearly 800 wooden crosses bearing the names of people murdered in the city in 2016 on December 31, 2016 in Chicago. (Scott Olson / Getty)

Hope Reese | Longreads | April 2019 | 11 minutes (3,002 words)

 

In recent years Chicago has had more homicides than any other city in America. From 1990 to 2010, roughly 14,000 people were killed there — more than the combined number of US soldiers killed in Iraq and Afghanistan, giving a horrifying legitimacy to the city’s infamous nickname Chiraq. It’s not clear, exactly, why this is so — the rest of the country is experiencing a period of historically low crime. In fact, Chicago contributed nearly half of the country’s overall uptick in homicides in 2016.

Veteran reporter Alex Kotlowitz, author of the bestseller There Are No Children Here and producer of the award-winning documentary The Interrupters, has been chronicling the effects of violence on the city’s neighborhoods for decades. His recent book, An American Summer: Love and Death in Chicago, presents the cumulative effects of violence on the city through 14 vignettes. “For reasons I don’t fully understand, we just seem to be in the place where we have this extraordinarily tragic [violence],” he tells me. “Anybody who tells you they found the answer is just lying to you. Because nobody really knows.”

The book documents the complicated relationships between victims and perpetrators, the nature of the killing — how it is often cyclical and retributive — the way that violence scars communities, and his awe at survivors’ resiliency.

Read more…

How the Guardian Went Digital

Newscast Limited via AP Images

Alan Rusbridger | Breaking News | Farrar, Straus and Giroux | November 2018 | 31 minutes (6,239 words)

 

In 1993 some journalists began to be dimly aware of something clunkily referred to as “the information superhighway,” but few had ever had reason to see it in action. At the start of 1995 only 491 newspapers were online worldwide; by June 1997 that had grown to some 3,600.

In the basement of the Guardian was a small team created by editor in chief Peter Preston — the Product Development Unit, or PDU. The inhabitants were young and enthusiastic. None of them were conventional journalists: I think the label might be “creatives.” Their job was to think of new things that would never occur to the largely middle-aged reporters and editors three floors up.

The team — eventually rebranding itself as the New Media Lab — started casting around for the next big thing. They decided it was the internet. The creatives had a PC actually capable of accessing the world wide web. They moved in hipper circles. And they started importing copies of a new magazine, Wired — the so-called Rolling Stone of technology — which had started publishing in San Francisco in 1993, along with the HotWired website. “Wired described the revolution,” it boasted. “HotWired was the revolution.” It was launched in the same month the Netscape team was beginning to assemble. Only 18 months later Netscape was worth billions of dollars. Things were moving that fast.

In time, the team in PDU made friends with three of the people associated with Wired. They were the founders, Louis Rossetto and Jane Metcalfe, and the columnist Nicholas Negroponte, who was based at the Massachusetts Institute of Technology and who wrote mindblowing columns predicting such preposterous things as wristwatches which would “migrate from a mere timepiece today to a mobile command-and-control center tomorrow . . . an all-in-one, wrist-mounted TV, computer, and telephone.”

As if.

Both Rossetto and Negroponte were, in their different ways, prophets. Rossetto was a hot booking for TV talk shows, where he would explain to baffled hosts what the information superhighway meant. He’d tell them how smart the internet was, and how ethical. Sure, it was a “dissonance amplifier.” But it was also a “driver of the discussion” towards the real. You couldn’t mask the truth in this new world, because someone out there would weigh in with equal force. Mass media was one-way communication. The guy with the antenna could broadcast to billions, with no feedback loop. He could dominate. But on the internet every voice was going to be equal to every other voice.

“Everything you know is wrong,” he liked to say. “If you have a preconceived idea of how the world works, you’d better reconsider it.”

Negroponte, 50-something, East Coast gravitas to Rossetto’s Californian drawl, was working on a book, Being Digital, and was equally passionate in his evangelism. His mantra was to explain the difference between atoms — which make up the physical artifacts of the past — and bits, which travel at the speed of light and would be the future. “We are so unprepared for the world of bits . . . We’re going to be forced to think differently about everything.”

I bought the drinks and listened.

Over dinner in a North London restaurant, Negroponte started with convergence — the melting of all boundaries between TV, newspapers, magazines, and the internet into a single media experience — and moved on to the death of copyright, possibly the nation state itself. There would be virtual reality, speech recognition, personal computers with inbuilt cameras, personalized news. The entire economic model of information was about to fall apart. The audience would pull rather than wait for old media to push things as at present. Information and entertainment would be on demand. Overly hierarchical and status-conscious societies would rapidly erode. Time as we knew it would become meaningless — five hours of music would be delivered to you in less than five seconds. Distance would become irrelevant. A UK paper would be as accessible in New York as it was in London.

Writing 15 years later in the Observer, the critic John Naughton compared the creation of the world wide web by Sir Tim Berners-Lee with the seismic disruption caused five centuries earlier by the invention of movable type. Just as Gutenberg had no conception of his invention’s eventual influence on religion, science, systems of ideas, and democracy, so — in 2008 — “it will be decades before we have any real understanding of what Berners-Lee hath wrought.”

And so I decided to go to America with the leader of the PDU team, Tony Ageh, and see the internet for myself. A 33-year-old “creative,” Ageh had had exactly one year’s experience in media — as an advertising copy chaser for The Home Organist magazine — before joining the Guardian. I took with me a copy of The Internet for Dummies. Thus armed, we set off to America for a four-day, four-city tour.

In Atlanta, we found the Atlanta Journal-Constitution (AJC), which was considered a thought leader in internet matters, having joined the Prodigy Internet Service, an online service offering subscribers information over dial-up 1,200 bit/second modems. After four months the service had 14,000 members, paying 10 cents a minute to access online banking, messaging, full webpage hosting, and live share prices.

The AJC business plan envisaged building to 35,000 or 40,000 subscribers by year three. By that time, they calculated, they would be earning $3.3 million in subscription fees and $250,000 a year in advertising. “If it all goes to plan,” David Scott, the publisher of the Electronic Information Service, told us, “it’ll be making good money. If it goes any faster, this is a real business.”
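Those projections imply a surprisingly modest usage pattern. As a rough sanity check — assuming, speculatively, that all the subscription revenue came from the 10-cents-a-minute charge, which the plan may not have intended — the arithmetic works out like this:

```python
# Back-of-the-envelope check on the AJC's year-three plan:
# $3.3M a year in subscription fees from 35,000-40,000 members
# paying 10 cents a minute. (Speculative: assumes every dollar
# of that revenue came from the per-minute charge.)
RATE_PER_MINUTE = 0.10                 # dollars
SUBSCRIBERS = (35_000 + 40_000) / 2    # midpoint of the plan's range
TARGET_REVENUE = 3_300_000             # dollars per year

revenue_per_subscriber = TARGET_REVENUE / SUBSCRIBERS
minutes_per_month = revenue_per_subscriber / RATE_PER_MINUTE / 12

print(round(revenue_per_subscriber))  # 88 -- dollars a year per member
print(round(minutes_per_month))       # 73 -- minutes online a month
```

On those assumptions, each member needed to spend only a little over an hour a month online for the numbers to work — which makes the plan look conservative rather than wild.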

We also met Michael Gordon, the managing editor. “The appeal to the management is, crudely, that it is so much cheaper than publishing a newspaper,” he said.

We wrote it down.

“We know there are around 100,000 people in Atlanta with PCs. There are, we think, about one million people wealthy enough to own them. Guys see them as a toy; women see them as a tool. The goldmine is going to be the content, which is why newspapers are so strongly placed to take advantage of this revolution. We’re out to maximize our revenue by selling our content any way we can. If we can sell it on CD-ROM or TV as well, so much the better.”

“Papers? People will go on wanting to read them, though it’s obviously much better for us if we can persuade them to print them in their own homes. They might come in customized editions. Edition 14B might be for females living with a certain income.”

It was heady stuff.

From Atlanta we hopped up to New York to see the Times’s online service, @Times. We found an operation consisting of an editor plus three staffers and four freelancers. The team had two PCs, costing around $4,000 each. The operation was confident, but small.

The @Times content was weighted heavily towards arts and leisure. The opening menus offered a panel with about 15 reviews of the latest films, theatre, music, and books — plus book reviews going back two years. The site offered the top 15 stories of the day, plus some sports news and business.

There was a discussion forum about movies, with 47 different subjects being debated by 235 individual subscribers. There was no archive because — in one of the most notorious newspaper licensing cock-ups in history — the NYT in 1983 had given away all rights to its electronic archive (for all material more than 24 hours old) in perpetuity to Mead/Lexis.

That deal alone told you how nobody had any clue what was to come.

We sat down with Henry E. Scott, the group director of @Times. “Sound and moving pictures will be next. You can get them now. I thought about it the other day, when I wondered about seeing 30 seconds of The Age of Innocence. But then I realized it would take 90 minutes to download that and I could have seen more or less the whole movie in that time. That’s going to change.”

But Scott was doubtful about the lasting value of what they were doing — at least, in terms of news. “I can’t see this replacing the newspaper,” he said confidently. “People don’t read computers unless it pays them to, or there is some other pressing reason. I don’t think anyone reads a computer for pleasure. The San Jose Mercury [News] has put the whole newspaper online. We don’t think that’s very sensible. It doesn’t make sense to offer the entire newspaper electronically.”

We wrote it all down.

“I can’t see the point of news on-screen. If I want to know about a breaking story I turn on the TV or the radio. I think we should only do what we can do better than in print. If it’s inferior to the print version there’s no point in doing it.”

Was there a business plan? Not in Scott’s mind. “There’s no way you can make money out of it if you are using someone else’s server. I think the LA Times expects to start making money in about three years’ time. We’re treating it more as an R & D project.”


From New York we flitted over to Chicago to see what the Tribune was up to. In its 36-storey Art Deco building — a spectacular monument to institutional self-esteem — we found a team of four editorial and four marketing people working on a digital service, with the digital unit situated in the middle of the newsroom. The marketeers were beyond excited about the prospect of being able to show houses or cars for sale and arranged a demonstration. We were excited, too, even if the pictures were slow and cumbersome to download.

We met Joe Leonard, associate editor. “We’re not looking at Chicago Online as a money maker. We’ve no plans even to break even at this stage. My view is simply that I’m not yet sure where I’m going, but I’m on the boat, in the water — and I’m ahead of the guy who is still standing on the pier.”

Reach before revenue.

Finally we headed off to Boulder, Colorado, in the foothills of the Rockies, where Knight Ridder had a team working on their vision of the newspaper of tomorrow. The big idea was, essentially, what would become the iPad — only the team in Boulder hadn’t got much further than making an A4 block of wood with a “front page” stuck on it. The 50-something director of the research centre, Roger Fidler, thought the technology capable of realizing his dream of a “personal information appliance” was a couple of years off.

Tony and I had filled several notebooks. We were by now beyond tired and talked little over a final meal in an Italian restaurant beneath the Rocky Mountains.

We had come. We had seen the internet. We were conquered.

* * *

Looking back from the safe distance of nearly 25 years, it’s easy to mock the fumbling, wildly wrong predictions about where this new beast was going to take the news industry. We had met navigators and pioneers. They could dimly glimpse where the future lay. Not one of them had any idea how to make a dime out of it, but at the same time they intuitively sensed that it would be more reckless not to experiment. It seemed reasonable to assume that — if they could be persuaded to take the internet seriously — their companies would dominate in this new world, as they had in the old world.

We were no different. After just four days it seemed blindingly obvious that the future of information would be mainly digital. Plain old words on paper — delivered expensively by essentially Victorian production and distribution methods — couldn’t, in the end, compete. The future would be more interactive, more image-driven, more immediate. That was clear. But how on earth could you graft a digital mindset and processes onto the stately ocean liner of print? How could you convince anyone that this should be a priority when no one had yet worked out how to make any money out of it? The change, and therefore the threat, was likely to happen rapidly and maybe violently. How quickly could we make a start? Or was this something that would be done to us?

In a note for Peter Preston on our return I wrote, “The internet is fascinating, intoxicating . . . it is also crowded out with bores, nutters, fanatics and middle managers from Minnesota who want the world to see their home page and CV. It’s a cacophony, a jungle. There’s too much information out there. We’re all overloaded. You want someone you trust to fillet it, edit it and make sense of it for you. That’s what we do. It’s an opportunity.”

I spent the next year trying to learn more, and then the calendar clicked on to 1995 — The Year the Future Began, as the cultural historian W. Joseph Campbell would title his book about it two decades later. It was the year Amazon.com, eBay, Craigslist, and Match.com established their presence online. Microsoft spent $300m launching Windows 95 amid weeks of marketing hype, paying millions for the rights to the Rolling Stones hit “Start Me Up,” which became the launch’s anthem.

Cyberspace — as the cyber dystopian Evgeny Morozov recalled, looking back on that period — felt like space itself. “The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (‘Internet Explorer,’ ‘Netscape Navigator’).”

But, as Campbell was to reflect, “no industry in 1995 was as ill-prepared for the digital age, or more inclined to pooh-pooh the disruptive potential of the Internet and World Wide Web, than the news business.” It suffered from what he called “innovation blindness” — “an inability, or a disinclination to anticipate and understand the consequences of new media technology.”

1995 was, then, the year the future began. It happened also to be the year in which I became editor of the Guardian.

* * *

I was 41 and had not, until very recently, really imagined this turn of events. My journalism career took a traditional enough path. A few years reporting; four years writing a daily diary column; a stint as a feature writer — home and abroad. In 1986 I left the Guardian to be the Observer’s television critic. When I rejoined the Guardian I was diverted towards a route of editing — launching the paper’s Saturday magazine followed by a daily tabloid features section and moving to be deputy editor in 1993. Peter Preston — unshowy, grittily obstinate, brilliantly strategic — looked as if he would carry on editing for years to come. It was a complete surprise when he took me to the basement of the resolutely unfashionable Italian restaurant in Clerkenwell he favored, to tell me he had decided to call it a day.

On most papers the proprietor or chief executive would find an editor and take him or her out to lunch to do the deal. On the Guardian — at least according to tradition dating back to the mid-70s — the Scott Trust made the decision after balloting the staff, a process that involved manifestos, pub hustings, and even, by some candidates, a little frowned-on campaigning.

I supposed I should run for the job. My mission statement said I wanted to boost investigative reporting and get serious about digital. It was, I fear, a bit utopian. I doubt much of it impressed the would-be electorate: British journalists are programmed for skepticism about idealistic statements concerning their trade. Nevertheless, I won the popular vote and was confirmed by the Scott Trust after an interview in which I failed to impress at least one Trustee with my sketchy knowledge of European politics. We all went off for a drink in the pub round the back of the office. A month later I was editing.

“Fleet Street,” as the UK press was collectively called, was having a torrid time, not least because the biggest beast in the jungle, Rupert Murdoch, had launched a prolonged price war that was playing havoc with the economics of publishing. His pockets were so deep he could afford to slash the price of The Times almost indefinitely — especially if it forced others out of business.

Reach before revenue — as it wasn’t known then.

The newest kid on the block, the Independent, was suffering the most. To their eyes, Murdoch was behaving in a predatory way. We calculated the Independent titles were losing around £42 million (nearly £80 million in today’s money). Murdoch’s Times, by contrast, had seen its sales rocket 80 per cent by cutting its cover prices to below what it cost to print and distribute. The circulation gains had come at a cost — about £38 million in lost sales revenue. But Murdoch’s TV business, BSkyB, was making booming profits and the Sun continued to throw off huge amounts of cash. He could be patient.

The Telegraph had been hit hard — losing £45 million in circulation revenues through cutting the cover price by 18 pence. The end of the price war left it slowly clawing back lost momentum, but it was still £23 million adrift of where it had been the previous year. Murdoch — as so often — had done something bold and aggressive. Good for him, not so good for the rest of us. Everyone was tightening their belts in different ways. The Independent effectively gave up on Scotland. The Guardian saved a million a year in newsprint costs by shaving half an inch off the width of the paper.

The Guardian, by not getting into the price war, had “saved” around £37 million it would otherwise have lost. But its circulation had been dented by about 10,000 readers a day. Moreover, the average age of the Guardian reader was 43 — something that preoccupied us rather a lot. We were in danger of having a readership too old for the job advertisements we carried.

Though the Guardian itself was profitable, the newspaper division was losing nearly £12 million (north of £21 million today). The losses were mainly due to our sister Sunday title, the Observer, which the Scott Trust had purchased as a defensive move against the Independent in 1993. It had a distinguished history but was hemorrhaging cash: £11 million in losses.

Everything we had seen in America had to be put on hold for a while. The commercial side of the business never stopped reminding us that only three percent of households owned a PC and a modem.

* * *

But the digital germ was there. My love of gadgets had not extended to understanding how computers actually worked, so I commissioned a colleague to write a report telling me, in language I could understand, how our computers measured up against what the future would demand. The Atex system we had installed in 1987 gave everyone a dumb terminal on their desk — little more than a basic word processor. It couldn’t connect to the internet, though there was a rudimentary internal messaging system. There was no word count or spellchecker and storage space was limited. It could not be used with floppy disks or CD-ROMs. Within eight years of purchase it was already a dinosaur.

There was one internet connection in the newsroom, though most reporters were unaware of it. It was rumored that downstairs a bloke called Paul in IT had a Mac connected to the internet through a dial-up modem. Otherwise we were sealed off from the outside world.

A few journalist geeks began to invent Heath Robinson solutions to make the inadequate kit in Farringdon Road do the things we wanted in order to produce a technology website. Tom Standage — he later became deputy editor of the Economist, but then was a freelance tech writer — wrote some scripts to take articles out of Atex and format them into HTML so they could be moved onto the modest Mac web server — our first content management system, if you like. If too many people tried to read the tech site at once, the system crashed. So Standage and the site’s editor, Azeem Azhar, would take it in turns sitting in the server room in the basement of the building rebooting the machines by hand — unplugging them and physically moving the internet cables from one machine to another.
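Standage’s scripts are long gone, but the shape of such an Atex-to-HTML converter is easy to imagine. Here is a minimal sketch in modern Python — every detail of the input format is an assumption (headline on the first line, paragraphs separated by blank lines), and the function name is invented, not the original:

```python
import html

def article_to_html(raw_text, title=None):
    """Convert a plain-text article (headline on the first line,
    paragraphs separated by blank lines) into a minimal HTML page.

    A speculative sketch of the kind of script described above,
    not a reconstruction of the original code.
    """
    lines = raw_text.strip().split("\n")
    headline = title or lines[0].strip()
    body = "\n".join(lines[1:]).strip()
    # Escape <, >, and & so stray characters can't break the markup.
    paragraphs = [
        "<p>%s</p>" % html.escape(p.strip().replace("\n", " "))
        for p in body.split("\n\n")
        if p.strip()
    ]
    return (
        "<html><head><title>%s</title></head>\n<body>\n<h1>%s</h1>\n%s\n</body></html>"
        % (html.escape(headline), html.escape(headline), "\n".join(paragraphs))
    )
```

Feeding it a two-paragraph story yields a page with an `<h1>` headline and two `<p>` blocks — a file ready to be copied onto a web server, which is roughly all a mid-90s “content management system” had to do.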

What would the future look like? We imagined personalized editions, even if we had not the faintest clue how to produce them. We guessed that readers might print off copies of the Guardian in their homes — and even toyed with the idea of buying every reader a printer. There were glimmers of financial hope. Our readers were spending £56 million a year buying the Guardian but we retained none of it: the money went on paper and distribution. In the back of our minds we ran calculations about how the economics of newspapers would change if we could save ourselves the £56 million a year “old world” cost.

On top of editing, the legal entanglements sometimes felt like a full-time job on their own. Trying to engineer a digital future for the Guardian felt like a third job. There were somehow always more urgent issues. By March 1996, ideas we’d hatched in the summer of 1995 to graft the paper onto an entirely different medium were already out of date. That was a harbinger of the future. No plans in the new world lasted very long.

It was now apparent that we couldn’t get away with publishing selective parts of the Guardian online. Other newspapers had shot that fox by pushing out everything. We were learning about the connectedness of the web — and the IT team tentatively suggested that we might use some “offsite links” to other versions of the same story to save ourselves the need to write our own version of everything. This later became the mantra of the City University of New York (CUNY) digital guru Jeff Jarvis — “Do what you do best, and link to the rest.”

We began to grapple with numerous basic questions about the new waters into which we were gingerly dipping our toes.

Important question: Should we charge?

The Times and the Telegraph were both free online. A March 1996 memo from Bill Thompson, a developer who had joined the Guardian from Pipex, ruled it out:

I do not believe the UK internet community would pay to read an online edition of a UK newspaper. They may pay to look at an archive, but I would not support any attempt to make the Guardian a subscription service online . . . It would take us down a dangerous path.

In fact, I believe that the real value from an online edition will come from the increased contact it brings with our readers: online newspapers can track their readership in a way that print products never can, and the online reader can be a valuable commodity in their own right, even if they pay nothing for the privilege.

Thompson was prescient about how the overall digital economy would work — at least for players with infinitely larger scale and vastly more sophisticated technology.

What time of day should we publish?

The electronic Telegraph was published at 8 a.m. each day — mainly because of its print production methods. The Times, more automated, was available as soon as the presses started rolling. The Guardian started making some copy available from first edition through to the early hours. It would, we were advised, be fraught with difficulties to publish stories at the same time they were ready for the press.

Why were we doing it anyway?

Thompson saw the dangers of cannibalization — that readers would stop buying the paper if they could read it for free online — though giving content away could also be seen as a form of marketing. His memo seemed ambivalent as to whether we should venture into this new world at all:

The Guardian excels in presenting information in an attractive, easy to use and easy to navigate form. It is called a “broadsheet newspaper.” If we try to put the newspaper on-line (as the Times has done) then we will just end up using a new medium to do badly what an old medium does well. The key question is whether to make the Guardian a website, with all that entails in terms of production, links, structure, navigational aids etc. In summer 1995 we decided that we would not do this.

But was that still right a year later? By now we had the innovation team — PDU — still in the basement of one building in Farringdon Road, and another team in a Victorian loft building across the way in Ray Street. We were, at the margins, beginning to pick up some interesting fringe figures who knew something about computers, if not journalism. But none of this was yet pulling together into a coherent picture of what a digital Guardian might look like.

An 89-page business plan drawn up in October 1996 made it plain where the priorities lay: print.

We wanted to keep growing the Guardian circulation — aiming for a modest increase to 415,000 by March 2000, which would make us the ninth-biggest paper in the UK — with the Observer aiming for 560,000 with the aid of additional sections. A modest investment of £200,000 a year in digital was dwarfed by an additional £6 million cash injection into the Observer, spread over three years.

As for “on-line services” (we were still hyphenating it) we did want “a leading-edge presence” (whatever that meant), but essentially we thought we had to be there because we had to be there. By being there we would learn and innovate and — surely? — there were bound to be commercial opportunities along the road. It wasn’t clear what.

We decided we might usefully take broadcasting, rather than print, as a model — emulating its “immediacy, movement, searchability and layering.”

If this sounded as if we were a bit at sea, we were. We hadn’t published much digitally to this point. We had taken half a dozen meaty issues — including parliamentary sleaze, and a feature on how we had continued to publish on the night our printing presses had been blown up by the IRA — and turned them into special reports.

It is a tribute to our commercial colleagues that they managed to pull in the thick end of half a million pounds to build these websites. Other companies’ marketing directors were presumably like ours — anxious about the youth market and keen for their brands to feel “cool.” In corporate Britain in 1996, there was nothing much cooler than the internet, even if not many people had it, knew where to find it or understood what to do with it.

* * *

The absence of a controlling owner meant we could run the Guardian in a slightly different way from some papers. Each day began with a morning conference open to anyone on the staff. In the old Farringdon Road office, it was held around two long narrow tables in the editor’s office — perhaps 30 or 40 people sitting or standing. When we moved to our new offices at Kings Place, near Kings Cross in North London, we created a room that was, at least theoretically, less hierarchical: a horseshoe of low yellow sofas with a further row of stools at the back. In this room would assemble a group of journalists, tech developers and some visitors from the commercial departments every morning at about 10 a.m. If it was a quiet news day we might expect 30 or so. On big news days, or with an invited guest, we could host anything up to 100.

A former Daily Mail journalist, attending his first morning conference, muttered to a colleague in the newsroom that it was like Start the Week — a Monday morning BBC radio discussion program. All talk and no instructions. In a way, he was right: It was difficult, in conventional financial or efficiency terms, to justify 50 to 60 employees stopping work to gather together each morning for anything between 25 and 50 minutes. No stories were written during this period, no content generated.

But something else happened at these daily gatherings. Ideas emerged and were kicked around. Commissioning editors would pounce on contributors and ask them to write the thing they’d just voiced. The editorial line of the paper was heavily influenced, and sometimes changed, by the arguments we had. The youngest member of staff would be in the same room as the oldest: They would be part of a common discussion around news. By a form of accretion and osmosis an idea of the Guardian was jointly nourished, shared, handed down, and crafted day by day.

It led to a very strong culture. You might love the Guardian or despise it, but it had a definite sense of what it believed in and what its journalism was. It could sometimes feel like an intimidating meeting — even for, or especially for, the editor. The culture was intended to be one of challenge: If we’d made a wrong decision, or slipped up factually or tonally, someone would speak up and demand an answer. But challenge was different from blame: It was not a meeting for dressing downs or bollockings. If someone had made an error the previous day we’d have a post-mortem, or an unpleasant conversation, outside the room. We’d encourage people to want to contribute to this forum, not make them fear disapproval or denunciation.

There was a downside to this. It could, and sometimes did, lead to a form of group-think. However herbivorous the culture we tried to nurture, I was conscious of some staff members who felt awkward about expressing views outside what we hoped was a fairly broad consensus. But, more often, there would be a good discussion on two or three of the main issues of the day. We encouraged specialists or outside visitors to come in and discuss breaking stories. Leader writers could gauge the temperature of the paper before penning an editorial. And, from time to time, there would be the opposite of consensus: Individuals, factions, or groups would come and demand we change our line on Russia, bombing in Bosnia, intervention in Syria, Israel, blood sports, or the Labour leadership.

The point was this: The Guardian was not one editor’s plaything or megaphone. It emerged from a common conversation — and was open to internal challenge when editorial staff felt uneasy about aspects of our journalism or culture.

* * *

Within two years — slightly uncomfortable at the power I had acquired as editor — I gave some away. I wanted to make correction a natural part of the journalistic process, not a bitterly contested post-publication battleground designed to be as difficult as possible.

We created a new role on the Guardian: a readers’ editor. He or she would be the first port of call for anyone wanting to complain about anything we did or wrote. The readers’ editor would have daily space in the paper — off-limits to the editor — to correct or clarify anything and would also have a weekly column to raise broader issues of concern. It was written into the job description that the editor could not interfere. And the readers’ editor was given the security that he/she could not be removed by the editor, only by the Scott Trust.

On most papers editors had sat in judgment on themselves. They commissioned pieces, edited and published them — and then were supposed neutrally to assess whether their coverage had, in fact, been truthful, fair, and accurate. An editor might ask a colleague — usually a managing editor — to handle a complaint, but he/she was in charge from beginning to end. It was an autocracy. That mattered even more in an age when some journalism was moving away from mere reportage and observation to something closer to advocacy or, in some cases, outright pursuit.

Allowing even a few inches of your own newspaper to be beyond your direct command meant that your own judgments, actions, ethical standards and editorial decisions could be held up to scrutiny beyond your control. That, over time, was bound to change your journalism. Sunlight is the best disinfectant: that was the journalist-as-hero story we told about what we do. So why wouldn’t a bit of sunlight be good for us, too?

The first readers’ editor was Ian Mayes, a former arts and obituaries editor then in his late 50s. We felt the first person in the role needed to have been a journalist — and one who would command instant respect from a newsroom which otherwise might be somewhat resistant to having their work publicly critiqued or rebutted. There were tensions and some resentment, but Ian’s experience, fairness and flashes of humor eventually won most people round.

One or two of his early corrections convinced staff and readers alike that he had a light touch about the fallibility of journalists:

In our interview with Sir Jack Hayward, the chairman of Wolverhampton Wanderers, page 20, Sport, yesterday, we mistakenly attributed to him the following comment: “Our team was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Sir Jack had just declined the offer of a hot drink. What he actually said was: “Our tea was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Profuse apologies.

In an article about the adverse health effects of certain kinds of clothing, pages 8 and 9, G2, August 5, we omitted a decimal point when quoting a doctor on the optimum temperature of testicles. They should be 2.2 degrees Celsius below core body temperature, not 22 degrees lower.

But in his columns he was capable of asking tough questions about our editorial decisions — often prompted by readers who had been unsettled by something we had done. Why had we used a shocking picture which included a corpse? Were we careful enough in our language around mental health or disability? Why so much bad language in the Guardian? Were we balanced in our views of the Kosovo conflict? Why were Guardian journalists so innumerate? Were we right to link to controversial websites?

In most cases Mayes didn’t come down on one side or another. He would often take readers’ concerns to the journalist involved and question them — sometimes doggedly — about their reasoning. We learned more about our readers through these interactions; and we hoped that Mayes’s writings, candidly explaining the workings of a newsroom, helped readers better understand our thinking and processes.

It was, I felt, good for us to be challenged in this way. Mayes was invaluable in helping devise systems for the “proper” way to correct the record. A world in which — to coin a phrase — you were “never wrong for long” posed the question of whether you went in for what Mayes termed “invisible mending.” Some news organizations would quietly amend whatever it was that they had published in error, no questions asked. Mayes felt differently: The act of publication was something on the record. If you wished to correct the record, the correction should be visible.

We were some years off the advent of social media, in which any error was likely to be pounced on in a thousand hostile tweets. But we had some inkling that the iron grip of centralized control that a newspaper represented was not going to last.

I found liberation in having created this new role. There are few things editors enjoy less than the furious early morning phone call or email from the irate subject of their journalism. Either the complainant is wrong — in which case there is time wasted in heated self-justification — or they’re right, wholly or partially, and immediately you’re into remorseful calculations about saving face. If readers knew we honestly and rapidly — even immediately — owned up to our mistakes they should, in theory, trust us more. That was the David Broder theory, and I bought it. Readers certainly made full use of the readers’ editor’s existence. Within five years Mayes was dealing with around 10,000 calls, emails, and letters a year — leading to around 1,200 corrections, big and small. It’s not, I think, that we were any more error-prone than other papers. But if you win a reputation for openness, you’d better be ready to take it as seriously as your readers will.

Our journalism became better. If, as a journalist, you know there are a million sleuth-eyed editors out there waiting to leap on your tiniest mistake, it makes you more careful. It changes the tone of your writing. Our readers often know more than we do. That became a mantra of the new world, coined by the blogger and academic Dan Gillmor in his 2004 book We the Media, but it was already becoming evident in the late 1990s.

The act of creating a readers’ editor felt like a profound recognition of the changing nature of what we were engaged in. Journalism was not an infallible method guaranteed to result in something we would proclaim as The Truth — but a more flawed, tentative, iterative and interactive way of getting towards something truthful.

Admitting that felt both revolutionary and releasing.

***

Excerpted from Breaking News: The Remaking of Journalism and Why It Matters Now by Alan Rusbridger, published by Farrar, Straus and Giroux on November 27, 2018. Copyright © 2018 by Alan Rusbridger. All rights reserved.

Longreads Editor: Aaron Gilbreath