American Sphinx

Illustration by Katie Kosma

Colin Dickey | Longreads | August 2017 | 14 minutes | 3380 words

We had come to a place muted of light. Every day felt like a potential backsliding, the news unrelenting, as though the nation had finally given up pushing back against its own savagery — and every day felt like the held breath before the fall. I thought increasingly of Stefan Lux, a Jewish journalist from Slovakia: Aghast at the rise of anti-Semitism during the 1930s, and at the inability of Europe’s bureaucratic governments to respond, Lux walked into the General Assembly of the League of Nations and, before the gathered diplomats, fatally shot himself. His last words were “C’est le dernier coup.” This is the final blow. It was only July 3, 1936; the blows would keep coming long after Lux’s death.

The center was not holding; there hadn’t been any center for decades. It was a country of bankrupt politicians, of killings by police so commonplace they barely made the news. It was a country in which families were routinely broken up by early morning immigration raids, where men abducted for traffic violations and women arrested for misdemeanors were sent off to countries they hadn’t known for decades. It was a nation where young white men found solace drifting through rage and irony, and felt alive only by terrorizing others. It was not a country in open revolution, but more and more its people felt revolution would at least be the exhalation they’d been waiting for. It was a country waiting for the final blow.

Whatever rough beast Yeats had seen had already slouched its way out of the desert, laying waste to everything that fell under its pitiless, blank gaze. The body of a lion and the head of a man, the indignant desert birds circling around its slow thighs, it had torn away the veneer of civility and decorum that had once been papered over the country.

Read more…

Are Arizona’s Defunded Public Schools the Future of American Education?

Trump’s secretary of education is expanding school voucher programs under the guise of providing greater “school choice” to parents, but as some historically underfunded public school systems show, further divestment spells disaster for public education and the hope for an educated public. At Harper’s, Alexandria Neason spent time in Phoenix, Arizona, to examine the effects of divestment. From lawsuits and budget cuts to unsafe buildings, Arizona’s struggling public schools suffer some of the country’s lowest teacher salaries and lowest per-pupil funding. Naturally, many teachers leave after five years, and the state’s teachers no longer just teach. They canvass door-to-door to ask citizens to help schools financially, and they use their own money to buy books and basic supplies that all public schools should have. And yet, against all evidence and logic, the secretary of education is advocating for a national voucher program. This isn’t the future Americans deserve.

When Governor Ducey signed the new E.S.A. bill into law, he did so in the absence of any studies evaluating its effectiveness. Across the country, there has been relatively little long-term research examining voucher programs, and the findings that do exist are at best mixed. In Milwaukee, a report found that while some voucher kids are more likely to graduate on to a four-year college, there is little to support the notion that, on the basis of test scores, they are better prepared. A recent study of Indiana’s program, which was expanded while Mike Pence was the governor, discovered that students saw drops in math scores, and did not improve in reading until they spent at least four years in private school. In April, the U.S. Department of Education released an analysis of the program in Washington, D.C., the nation’s only federally funded voucher system. The results were grim: Students who used vouchers earned markedly lower scores on math tests in their first year compared with those who applied but did not receive a voucher. Children in kindergarten through fifth grade also had lower reading scores. Secretary DeVos defended the program anyway, insisting that parents overwhelmingly support it.

The election of Trump, and DeVos’s confirmation, has effectively made school choice into national policy. The vouchers, education savings accounts, and tax-credit programs that already exist are poised to grow. This year, thirty-five states have introduced bills that would either create or expand school choice programs. On the federal level, DeVos’s education budget proposal includes $9.2 billion in cuts. (If implemented, it will gut teacher preparation and professional development, after-school programs, Special Olympics activities, American history, and the arts, among other things.) She will instead finance her school choice priorities, namely a $250 million increase in scholarships that send kids to private (including religious) schools and a $1 billion infusion to the Furthering Options for Children to Unlock Success (FOCUS) program, which sends money to districts that do away with zoning and adopt open enrollment — as Arizona does.

In June, DeVos’s camp received judicial validation. The U.S. Supreme Court ruled in favor of the Trinity Lutheran Church Child Learning Center, in Missouri, which sought public funds to build a playground. “We should all celebrate the fact that programs designed to help students will no longer be discriminated against by the government based solely on religious affiliation,” DeVos cheered. In the dissent, Justice Sotomayor, joined by Justice Ginsburg, wrote that the decision “slights both our precedents and our history, and its reasoning weakens this country’s longstanding commitment to a separation of church and state beneficial to both.”

Read the story

How a Great American Theatrical Family Produced the 19th Century’s Most Notorious Assassin

John Wilkes Booth, Edwin Booth and Junius Booth, Jr. (from left to right) in Shakespeare’s Julius Caesar in 1864. Photo via Wikimedia Commons

Nora Titone | My Thoughts Be Bloody: The Bitter Rivalry Between Edwin and John Wilkes Booth That Led to an American Tragedy | The Free Press | October 2010 | 41 minutes (11,244 words)

 

Below is an excerpt from the book My Thoughts Be Bloody, by Nora Titone, as recommended by Longreads contributor Dana Snitzky, who writes: 

“This is the story of the celebrated Booth family in the final year before John Wilkes made a mad leap into historical memory that outdid in magnitude every accomplishment of his father and brothers. When the curtain rises on this chapter of Nora Titone’s book, both Edwin and John Wilkes have already staged performances for President Lincoln at Ford’s Theater; by the time it comes down, one of them will be readying to assassinate him there.” 

Read more…

American Horror, Ivy League Edition

Longreads Pick

“Perhaps what Will Hunting says to a pompous Harvard scholar is really true: ‘You dropped a hundred and fifty grand on an education you coulda’ picked up for a dollar fifty in late charges at the public library.’ Except, of course, an Ivy League education has become even more obscenely expensive in the 17 years since Good Will Hunting romanticized Southie autodidactism.” An examination of three books criticizing the Ivy League.

Source: Newsweek
Published: Aug 8, 2014
Length: 14 minutes (3,715 words)

Why Do So Many People Pretend to Be Native American?

Illustration by Kjell Reigstad

Russell Cobb | This Land Press | August 2014 | 16 minutes (3,976 words)

For this week’s Longreads Member Pick, we are thrilled to share a brand new essay from Oklahoma’s This Land Press, just published in their August 2014 issue. This Land has been featured on Longreads often in the past—you can support them here.

* * *

Read more…

One Man’s Poison

Richard Baker/via Getty Images

Kyoko Mori | Apple, Tree: Writers on Their Parents | University of Nebraska Press | September 2019 | 19 minutes (3,670 words)

 

Before my mother’s suicide the year I turned twelve, my father and I seldom saw each other. An engineer who became a board director at a steel-manufacturing conglomerate, Hiroshi traveled all over the country on business. Even when he worked in his office in Kobe, he left early and came back — if he came back — past midnight. My mother waited up, but he often called from some noisy bar to claim he was leaving on a business trip. Other phone calls, from women looking for him, made clear that my father had several girlfriends who vied for his attention. I can’t remember a time when I didn’t know that he was a liar and a cheat and that women were attracted to him all the same.

Since his free time was devoted to playing rugby with former college teammates, Hiroshi seldom joined my mother, brother, and me on family vacations or outings. He did once attend a family reunion — for his side of the family — at a Chinese restaurant in downtown Kobe. My brother, Jumpei, four years younger than me, was still a toddler. When we got to the restaurant, our relatives hadn’t arrived yet, the banquet room wasn’t ready, and my mother had to take Jumpei to the bathroom. I was left to sit at the bar with Hiroshi while we waited. He must have had to help me up to the barstool, but I don’t remember him lifting me or holding me on that occasion or any other. What I do recall is the woman behind the bar placing a glass of soda pop in front of me, smiling in an exaggerated way, and saying, “You look just like your father. How lucky for you. He is so very handsome.”

Read more…

The Artificial Intelligence of the Public Intellectual

morkeman / Getty

Soraya Roberts | Longreads | May 2019 | 8 minutes (2,228 words)

“Well, that’s a really important thing to investigate.” While Naomi Wolf’s intellectual side failed her last week, her public side did not. That first line was her measured response when a BBC interviewer pointed out — on live radio — that cursory research had disproven a major thesis in her new book, Outrages: Sex, Censorship, and the Criminalization of Love (she misinterpreted a Victorian legal term, “death recorded,” to mean execution — the term actually meant the person was pardoned). Hearing this go down, journalists like me theorized how we would react in similar circumstances (defenestration) and decried the lack of fact-checkers in publishing (fact: Authors often have to pay for their own). The mistake did, however, ironically, offer one corrective: It turned Wolf from cerebral superhero into mere mortal. No longer was she an otherworldly intellect who could suddenly complete her Ph.D. — abandoned at Oxford when she was a Rhodes Scholar in the mid-’80s, Outrages is a reworking of her second, successful, attempt — while juggling columns for outlets like The Guardian, a speaking circuit, an institute for ethical leadership, and her own site, DailyClout, not to mention a new marriage. Something had to give, and it was the Victorians.

Once, the public intellectual had the deserved reputation of a scholarly individual who steered the public discourse: I always think of Oscar Wilde, the perfect dinner wit who could riff on any subject on command and always had the presence of mind to come up with an immortal line like, “One can survive everything nowadays except death.” The public intellectual now has no time for dinner. Wolf, for instance, parlayed the success of her 1991 book The Beauty Myth into an intellectual career that has spanned three decades, multiple books, and a couple of political advisory jobs, in which time her supposed expertise has spread far beyond third-wave feminism. She has become a symbol of intellectual rigor that spans everything from vaginas to dictatorships — a sort of lifestyle brand for the brain. Other thought leaders like her include Jordan Peterson, Fareed Zakaria, and Jill Abramson. Their minds have hijacked the public trust, each one acting as the pinnacle of intellect, an individual example of brilliance to cut through all the dullness, before sacrificing the very rigor that put them there in order to maintain the illusion floated by the media, by them, even by us. The public intellectual once meant public action, a voice from the outside shifting the inside, but then it became personal, populated by self-serving insiders. The public intellectual thus became an extension — rather than an indictment — of the American Dream, the idea that one person, on their own, can achieve anything, including being the smartest person in the room as well as the richest.

* * *

I accuse the Age of Enlightenment of being indirectly responsible for 12 Rules for Life. The increasingly literate population of the 18th century was primed to live up to the era’s ultimate aspiration: an increasingly informed public. This was a time of debates, public lectures, and publications, and of fame for the academics behind them. Ralph Waldo Emerson, for one. In his celebrated “The American Scholar” speech from 1837, Emerson provided a framework for an American cultural identity — distinct from Europe’s — composed of a multifaceted intellect (the One Man theory). “The scholar is that man who must take up into himself all the ability of the time, all the contributions of the past, all the hopes of the future,” he said. “In yourself slumbers the whole of Reason; it is for you to know all, it is for you to dare all.” While Emerson argued that the intellectual was bound to action, the “public intellectual” really arrived at the end of the 19th century, when the French novelist Émile Zola publicly accused the French military of antisemitism over the Dreyfus Affair in an open letter published in the newspaper L’Aurore in 1898. With “J’Accuse…!,” the social commentary Zola spread through his naturalist novels was transformed into a direct appeal to the public: Observational wisdom became intellectual action. “I have but one passion: to enlighten those who have been kept in the dark, in the name of humanity which has suffered so much and is entitled to happiness,” he wrote. “My fiery protest is simply the cry of my very soul.”

The public intellectual thenceforth became the individual who used scholarship for social justice. But only briefly. After the Second World War, universities opened up to serve those who had served America, which led to a boost in educated citizens and a captive audience for philosophers and other scholars. By the end of the ’60s, television commanded our attention further with learned debates on The Dick Cavett Show — where autodidact James Baldwin famously dressed down Yale philosopher Paul Weiss — and Firing Line with William F. Buckley Jr. (also famously destroyed by Baldwin), which would go on to host academics like Camille Paglia in the ’90s. But Culture Trip editor Michael Barron dates the “splintering of televised American intellectualism” to a 1968 debate between Gore Vidal — “I want to make 200 million people change their minds,” the “writer-hero” once said — and Buckley, which devolved into playground insults. A decade later, the public intellectual reached its celebrity peak, with Susan Sontag introducing the branded brain in People magazine (“I’m a book junkie. … I buy special editions like other women shop for designer originals at Saks.”).

As television lost patience with Vidal’s verbose bravado, he was replaced with more telegenic — angrier, stupider, more right-wing — white men like Bill O’Reilly, who did not clarify nuance but blustered over the issues of the day; the public intellectual was now all public, no intellect. Which is to say, the celebrity pushed out the scholar, but it was on its way out anyway. By the ’80s, the communal philosophical and political conversations of the post-war era slunk back to the confines of academia, which became increasingly professionalized, specialized, and insular, producing experts with less general and public-facing knowledge. “Anyone who engages in public debate as a scholar is at risk of being labelled not a serious scholar, someone who is diverting their attention and resources away from research and publicly seeking personal aggrandizement,” one professor told University Affairs in 2014. “It discourages people from participating at a time when public issues are more complicated and ethically fraught, more requiring of diverse voices than ever before.” Diversity rarely got past the ivy, with the towering brilliance of trespassers like Baldwin and Zora Neale Hurston, among other marginalized writers, limited by their circumstances. “The white audience does not seek out black public intellectuals to challenge their worldview,” wrote Mychal Denzel Smith in Harper’s last year, “instead they are meant to serve as tour guides through a foreign experience that the white audience wishes to keep at a comfortable distance.”

Speaking of white audiences … here’s where I mention the intellectual dark web even though I would rather not. It’s the place — online, outside the academy, in pseudo-intellectual “free thought” mag Quillette — where reactionary “intellectuals” flash their advanced degrees while claiming their views are too edgy for the schools that graduated them. These are your Petersons, your Sam Harrises, your Ben Shapiros, the white (non)thinkers, usually men, tied in some vague way to academia, which they use to validate their anti-intellectualism while passing their feelings off as philosophy and, worse, as (mis)guides for the misguided. Last month, a hyped debate between psychology professor Peterson and philosopher Slavoj Žižek had the former spending his opening remarks stumbling around Marxism, having only just read The Communist Manifesto for the first time since high school. As Andray Domise wrote in Maclean’s, “The good professor hadn’t done his homework.” But neither have his fans.

But it’s not just the conservative public intellectuals who are slacking off. Earlier this year, Jill Abramson, the former executive editor of The New York Times, published Merchants of Truth: The Business of News and the Fight for Facts. She was the foremost mind on journalism in the Trump era for roughly two seconds before being accused of plagiarizing parts of her book. Her response revealed that the authorship wasn’t exactly hers alone, a fact that came to light only because she needed to blame others for her mistakes. “I did have fact-checking, I did have assistants in research, and in some cases, the drafting of parts of the book,” she told NPR. “I certainly did spend money. But maybe it wasn’t enough.” Abramson’s explanation implied a tradition in which, if you are smart enough to be rich enough, you can pay to uphold your intellectual reputation, no matter how artificial it may be.

That certainly wasn’t the first time a public intellectual overrepresented their abilities. CNN host Fareed Zakaria, a specialist in foreign policy with a Ph.D. from Harvard — a marker of intelligence that can almost stand in for actual acumen these days — has been accused multiple times of plagiarism, despite “stripping down” his extensive workload (books, speeches, columns, tweets). Yet he continues to host his own show and to write a column for The Washington Post in the midst of a growing number of unemployed journalists and a dwindling number of outlets. Which is part of the problem. “What happens in the media is the cult of personality,” said Charles R. Eisendrath, director of the Livingston Awards and Knight-Wallace Fellowship, in the Times. “As long as it’s cheaper to brand individual personalities than to build staff and bolster their brand, they will do it.” Which is why Wolf, and even Abramson, are unlikely to be gone for good.

To be honest, we want them around. Media output hasn’t contracted along with the industry, so it’s easier to follow an individual than a sprawling media site, just like it’s easier to consult a YouTube beauty influencer than it is to browse an entire Sephora. With public intellectuals concealing the amount of work required of them, the pressure to live up to the myth we are all helping to maintain only increases, since the rest of us have given up on trying to keep pace with these superstars. They think better than we ever could, so why should we bother? Except that, like the human beings they are, they’re cutting corners and making errors and no longer have room to think the way they did when they first got noticed. It takes significant strength of character in this economy of nonstop (and precarious) work to bow out, but Ta-Nehisi Coates did when he stepped down last year from his columnist gig at The Atlantic, where he had worked long before he started writing books and comics. “I became the public face of the magazine in many ways and I don’t really want to be that,” he told The Washington Post. “I want to be a writer. I’m not a symbol of what The Atlantic wants to do or whatever.”

* * *

Of course a public intellectual saw this coming. In a 1968 discussion between Norman Mailer and Marshall McLuhan on identity in the technology age (which explains the rise in STEM-based public intellectuals), the latter said, “When you give people too much information, they resort to pattern recognition.” The individuals who have since become symbols of thought — from the right (Christina Hoff Sommers) to the left (Roxane Gay) — are overrepresented in the media, contravening the original definition of their role as outsiders who spur public action against the insiders. In a capitalist system that promotes branded individualism at the expense of collective action, the public intellectual becomes a myth of impossible aspiration that not even it can live up to, which is the point — to keep selling a dream that is easier to buy than to engage in reality. But an increasingly intelligent public is gaining ground.

The “Public Intellectual” entry in Urban Dictionary defines it as “a professor who spends too much time on Twitter,” citing Peterson as an example. Ha? The entry is by OrinKerr, who may or may not be (I am leaning toward the former) a legal scholar who writes for the conservative Volokh Conspiracy blog. His bad joke is facetious, but not entirely inaccurate — there’s a shift afoot, from the traditional individual public intellectual toward a collective model. That includes online activists and writers like Mikki Kendall, who regularly leads discussions about feminism and race on Twitter; Bill McKibben, who cofounded 350.org, an online community of climate change activists; and YouTubers like Natalie Wynn, whose ContraPoints video essays respond to real questions from alt-right men. In both models, complex thought does not reside solely with the individual, but engages the community. This is a reversion to one of the early definitions of public intellectualism by the philosopher Antonio Gramsci. “The traditional and vulgarized type of the intellectual is given by the man of letters, the philosopher, the artist,” he wrote in his Prison Notebooks — first published in 1971. “The mode of being of the new intellectual can no longer consist in eloquence, which is an exterior and momentary mover of feelings and passions, but in active participation in practical life, as constructor, organizer, ‘permanent persuader’ and not just a simple orator.” It doesn’t matter if you’re the smartest person in the room, as long as you can make it move.

* * *

Soraya Roberts is a culture columnist at Longreads.

The Anarchists Who Took the Commuter Train

A matchbook ad for Pennsylvania Railroad, 1940. Jim Heimann Collection / Getty.

Amanda Kolson Hurley | An excerpt from Radical Suburbs: Experimental Living on the Fringes of the American City | Belt Publishing | April 2019 | 19 minutes (4,987 words)

The Stelton colony in central New Jersey was founded in 1915. Humble cottages (some little more than shacks) and a smattering of public buildings ranged over a 140-acre tract of scrubland a few miles north of New Brunswick. Unlike America’s better-known experimental settlements of the nineteenth century, Stelton was not a refuge for a devout religious sect but a hive of political radicals, where federal agents came snooping during the Red Scare of 1919-1920. But it was also a suburb, a community of people who moved out of the city for the sake of their children’s education and to enjoy a little land and peace. They were not even the first people to come to the area with the same idea: There was already a German socialist enclave nearby, called Fellowship Farm.

The founders of Stelton were anarchists. In the twenty-first century, the word “anarchism” evokes images of masked antifa facing off against neo-Nazis. What it meant in the early twentieth century was different, and not easily defined. The anarchist movement emerged in the mid-nineteenth century alongside Marxism, and the two were allied for a time before a decisive split in 1872. Anarchist leader Mikhail Bakunin rejected the authority of any state — even a worker-led state, as Marx envisioned — and therefore urged abstention from political engagement. Engels railed against this as a “swindle.”

But anarchism was less a coherent, unified ideology than a spectrum of overlapping beliefs, especially in the United States. Although some anarchists used violence to achieve their ends, like Leon Czolgosz, who assassinated President William McKinley in 1901, others opposed it. Many of the colonists at Stelton were influenced by the anarcho-pacifism of Leo Tolstoy and by the land-tax theory of Henry George. The most venerated hero was probably the Russian scientist-philosopher Peter Kropotkin, who argued that voluntary cooperation (“mutual aid”) was a fundamental drive of animals and humans, and opposed centralized government and state laws in favor of small, self-governing, voluntary associations such as communes and co-ops.

Read more…

Orwell’s Last Neighborhood

Barnhill on the Isle of Jura, Scotland. (David Brown)

David Brown | The American Scholar | April 2019 | 23 minutes (5,796 words)

It’s hard to know what would be a good place from which to imagine a future of bad smells and no privacy, deceit and propaganda, poverty and torture. Does a writer need to live in misery and ugliness to conjure up a dystopia?

Apparently not.

We’d been walking more than an hour. The road was two tracks of pebbled dirt separated by a strip of grass. The land was treeless as prairie, with wildflowers and the seedless tops of last year’s grass smudging the new growth.

We rounded a curve and looked down a hillside to the sea. A half mile in the distance, far back from the water, was a white house with three dormer windows. Behind it, a stone wall cut a diagonal to the water like a seam stitching mismatched pieces of green velvet. Far to the right, a boat moved along the shore, its sail as bright as the house.

This was where George Orwell wrote Nineteen Eighty-Four. The house, called Barnhill, sits near the northern end of Jura, an island off Scotland’s west coast in the Inner Hebrides. It was June 2, sunny, short-sleeve warm, with the midges barely out, and couldn’t have been more beautiful.

Orwell lived here for parts of the last three years of his life. He left periodically (mostly in the winter) to do journalism in London and, for seven months in 1947 and 1948, to undergo treatment for pulmonary tuberculosis. Although he rented Barnhill and didn’t own it, he put in fruit trees and a garden, built a chicken house, bought a truck and a boat, and invested numberless hours of labor in what he believed would be his permanent home. When he left it for the last time, in January 1949, he never again lived outside a sanatorium or hospital.

I came to Jura after a two-week backpacking trip across Scotland. My purpose was to drink single-malt on Islay, the island to the south, and enjoy two nights of indulgence at Ardlussa House, where Orwell’s landlord had lived. I was not on a literary pilgrimage. Barnhill is not open to the public, and no one among the island’s 235 residents remembers Orwell.

Read more…

The Ladies Who Were Famous for Wanting to Be Left Alone

Sarah Ponsonby and Lady Eleanor Butler In Their Library, engraving by Richard James Lane (Creative Commons)

 

Patricia Hampl | Excerpt adapted from The Art of the Wasted Day | Viking | April 2018 | 18 minutes (4,735 words)

 

On the night of Monday, March 30, 1778, an Anglo-Irish lady named Sarah Ponsonby, age twenty-three, the unmarried dependent of well-placed relatives (her parents long dead), slipped out of her guardians’ Georgian mansion in Woodstock, Kilkenny, the rest of the house asleep. She was dressed in men’s clothing, had a pistol on her, and carried her little dog, Frisk.

She made her way to the estate’s barn where Lady Eleanor Butler, a spinster sixteen years her senior, a member of one of the beleaguered old Catholic dynasties of Ireland (the Dukes — later the Earls — of Ormonde), was awaiting her, having decamped from stony Butler Castle twelve miles distant on a borrowed horse. She too was wearing men’s breeches and a topcoat.

Their plan, long schemed, was to ride through the night, the moon a bare sliver, to Waterford, twenty-three miles away on the coast, and from there to embark for England to live together somewhere (they had no exact destination) in “delicious seclusion.” Their goal was “Retirement,” a life of “Sentiment” and “Tenderness.”

Read more…