The Longreads Blog

Ida B. Wells-Barnett Was Born Today in 1862

Smithsonian National Portrait Gallery

Today marks the birthday of Ida B. Wells-Barnett — the educator and journalist who pioneered investigative reporting techniques, still in use today, to uncover the details of lynchings across the South. Wells-Barnett ran a newspaper, the Memphis Free Speech, and also helped found the NAACP and the National Association of Colored Women.

Wells was born enslaved on July 16, 1862, in Holly Springs, Mississippi. She moved to Chicago in 1894 and died there in 1931.

In its series “Overlooked,” launched last March, the New York Times ran obituaries of important figures whose deaths had previously gone unmentioned in the paper. Wells-Barnett’s was among the first wave of belated obituaries:

Wells was already a 30-year-old newspaper editor living in Memphis when she began her anti-lynching campaign, the work for which she is most famous. After [her friend, Thomas] Moss was killed, she set out on a reporting mission, crisscrossing the South over several months as she conducted eyewitness interviews and dug up records on dozens of similar cases.

Her goal was to question a stereotype that was often used to justify lynchings — that black men were rapists. Instead, she found that in two-thirds of mob murders, rape was never an accusation. And she often found evidence of what had actually been a consensual interracial relationship.

She published her findings in a series of fiery editorials in the newspaper she co-owned and edited, The Memphis Free Speech and Headlight. The public, it turned out, was starved for her stories and devoured them voraciously. The Journalist, a mainstream trade publication that covered the media, named her “The Princess of the Press.”

Readers of her work were drawn in by her fine-tooth reporting methods and language that, even by today’s standards, was aberrantly bold.

“There has been no word equal to it in convincing power,” Frederick Douglass wrote to her in a letter that hatched their friendship. “I have spoken, but my word is feeble in comparison,” he added.

He was referring to writing like the kind that she published in The Free Speech in May 1892.

Wells-Barnett’s great-granddaughter Michelle Duster is organizing and fundraising for a monument to the journalist to be built in Chicago.

Read the obituary

 

The Far Right’s Fight Against Race-Conscious School Admissions

WASHINGTON, DC - OCTOBER 10: Attorney Bert Rein (L) speaks to the media while standing with plaintiff Abigail Noel Fisher (R) after the U.S. Supreme Court heard arguments in her case on October 10, 2012 in Washington, DC. The high court heard oral arguments in Fisher v. University of Texas at Austin and is tasked with ruling on whether the university's consideration of race in admissions is constitutional. (Photo by Mark Wilson/Getty Images)

Late in the afternoon on July 3, the Department of Justice announced it was rescinding 24 documents issued by the Obama administration between 2011 and 2016. The documents offered guidance to a range of constituencies, including homeowners, law enforcement, and employers. Some detailed employment protections for refugees and asylees; seven of the 24 discussed policies and Supreme Court rulings on race-conscious admissions practices in elementary, secondary, and post-secondary schools. In its statement, the DOJ called the guides “unnecessary, outdated, inconsistent with existing law, or otherwise improper.”

No immediate policy change will come from the documents’ removal. It’s more of a signal, a gesture in a direction, a statement about ideology. The Trump administration has already enacted several hard-line positions on immigration. And the Sessions-backed Justice Department has made a habit of signaling, by way of gesture, its opposition to affirmative action, and its belief that race-conscious policies, specifically, often amount to acts of discrimination.

***

The term “affirmative action” is ambiguous and has never been strictly defined. It’s a collection of notions, gestures, and ideas that existed before its present-day association with race. According to Smithsonian, the term was likely first used in the Depression-era Wagner Act. This legislation aimed to end harmful labor practices and encourage collective bargaining. It also mandated that employers found in violation “take affirmative action including reinstatement of employees with or without backpay” to prevent the continuation of harmful practices. The reinstatement and payment of dismissed employees were affirmative gestures that could be taken to right a wrong.

Six years later, in 1941, under pressure from organizer A. Philip Randolph, President Franklin D. Roosevelt issued Executive Order 8802 to prohibit race-based discrimination in the defense industries during the buildup to WWII. It is considered the first federal action to oppose racial discrimination since Reconstruction, and it paved the way for President John F. Kennedy, who was the first to use “affirmative action” in association with race in Executive Order 10925. Kennedy’s order instructed government contractors to take “affirmative action to ensure that applicants are employed,” regardless of “race, creed, color, or national origin.” President Lyndon B. Johnson reaffirmed and expanded Kennedy’s order when he issued Executive Order 11246 in 1965. Two years later, Johnson amended his own order to add sex to the list of protected attributes.

It was Republican president Richard Nixon who expanded the use of affirmative action to ensure equal employment in all facets of government in 1969, when he issued Executive Order 11478. Nixon ran for office in 1968 on “law and order” and “tough on crime” messaging. He believed what he called “black capitalism” — the idea of thriving black communities with high rates of employment and entrepreneurship — would ease the agitations of civil rights groups and end urban unrest. At the time, Nixon’s rhetoric won the support of a smattering of black cultural figures such as James Brown. “Black capitalism” was little more than a co-optation of some of the tenets of Black Power, which itself had come from a long-established line of conservative black political thought that emphasized economic empowerment and independence, self-determination, and personal responsibility. In his version, Nixon envisioned only a slight role for the federal government; without the push of significant government investment, the policies and programs he created didn’t result in sweeping change. Still, shadows of Nixon’s thinking on black economics endured: They’re present in multiple speeches Obama made to black audiences during his presidency, in Jay Z’s raps about the transformative, generational effects of his wealth, and in Kanye West’s TMZ and Twitter rants. The backlash Nixon faced is also remarkably similar in tone and content to today’s challenges to affirmative action, which typically involve a white person’s complaints about the incremental gains made by members of a previously disadvantaged group:

In 1969 Section 8(a) of the Small Business Act authorized the SBA to manage a program to coordinate government agencies in allocating a certain number of contracts to minority small businesses—referred to as procurements or contract “set-asides.” Daniel Moynihan, author of the controversial Moynihan Report, helped shape the program. By 1971 the SBA had allocated $66 million in federal contracts to minority firms, making it the most robust federal aid to minority businesses. Still, the total contracts given to minority firms amounted to only .1 percent of the $76 billion in total federal government contracts that year.

Yet even these miniscule minority set-asides immediately faced backlash from blue-collar workers, white construction firms, and conservatives, who called them “preferential treatment” for minorities. Ironically, multiple studies revealed that 20 percent of these already meager set-asides ended up going to white-owned firms.

***

A sense of lost advantage and power seems to animate both historical and recent challenges to race-based policies and practices. In Regents of the University of California v. Bakke (1978), the first affirmative action case the Supreme Court ruled on, Allan Bakke, a white applicant to the University of California at Davis medical school, sued the school after being twice denied admission. The school had created a system to set aside a certain number of spaces for students from marginalized groups. The Court decided that practices relying on quota systems were unconstitutional, but it upheld the use of race in admissions decisions as long as it was one among a host of other factors. Rulings in subsequent cases, such as Grutter v. Bollinger (2003) and, most recently, Fisher v. University of Texas (2016), supported the use of race in admissions and reiterated the federal government’s interest in the diversity of the nation’s institutions. In the most recent case, now-retired Justice Anthony Kennedy provided the Court’s swing vote.

Plaintiffs in affirmative action challenges tend to argue that race-conscious admissions policies violate rights granted by the Fourteenth Amendment, especially its clause guaranteeing “equal protection of the laws.” Ratified 150 years ago last week, the Fourteenth Amendment established birthright citizenship and defined citizenship’s parameters. Its ideas originated in the years leading up to Reconstruction, during “colored conventions” held among African American leaders and activists, and they form the underpinnings of Brown v. Board of Education (1954) and some provisions of the Civil Rights Act of 1964.

One of the most prominent opponents of affirmative action, Edward Blum, a fellow at the American Enterprise Institute, actively recruits aggrieved plaintiffs and attorneys to challenge race-based policies in school admissions and voting practices. Blum was the force behind the complaint of Abigail Fisher, the white student at the center of Fisher v. University of Texas. According to the New York Times:

In the Texas affirmative action case, he told a friend that he was looking for a white applicant to the University of Texas at Austin, his own alma mater, to challenge its admissions criteria. The friend passed the word to his daughter, Abigail Fisher. About six months later, the university rejected Ms. Fisher’s application.

“I immediately said, ‘Hey, can we call Edward?’” she recalled in an interview.

The case went to the Supreme Court twice, and though Ms. Fisher was portrayed as a less than stellar student, vilified as supporting a racist agenda, and ultimately lost, she said she still believed in Mr. Blum. “I think we started a conversation,” she said. “Edward obviously is not going to just lie down and play dead.”

Blum’s first lawsuit came about after he lost a Congressional election in Houston because, he felt, the boundaries of his district were drawn solely along racial lines. He is now behind lawsuits against Harvard University and the University of North Carolina at Chapel Hill, which allege the schools’ admissions policies discriminate against Asian American applicants. It is interesting and bold to use white women and Asian American students to dismantle programs meant to address America’s legacy of discrimination. Both groups have benefited significantly from Reconstruction and Civil Rights-era policies and legislation. Do Blum, Sessions, and their supporters believe race-based policies are irrelevant, illegal, or improper because for many, they’ve worked? I sense something more nefarious at play, such as a mounting sense of loss and growing resentment that the demographic shifts in our country also mean inevitable shifts in who holds power.

The Sessions-helmed Justice Department’s signals, and the nomination of Judge Brett Kavanaugh to the high court, have, I’m sure, heartened activists like Blum. For The Nation, Eric Foner wrote about how the Fourteenth Amendment’s ambiguity is what allows it to be used in ways so at odds with the spirit of its origins. It is that ambiguity, he says, that will someday, in a different political climate, allow for another era of correction.


The Castration Heard Around the World

AP Photo/File

In 1993, Lorena Bobbitt cut off her husband John Wayne Bobbitt’s penis and threw it out her car window. After police retrieved the penis, surgeons stitched it back on, though some argue that the cops should have left it on the ground by the 7-11. Lorena claimed John raped and repeatedly abused her. John claimed Lorena was lying and had married him for a green card. Juries found neither of them guilty of sexual assault or malicious wounding. By then, the couple had already entered the court of public opinion, which is where their story continues to live twenty-five years later.

At Vanity Fair, Lili Anolik revisits this enduring story, retelling it for a new generation and examining its relevance in 2018, a time of #MeToo, a sexist president, and ongoing assaults on women’s rights.

Legally, the case was a draw. By acquitting both John and Lorena, the judicial system was basically throwing up its hands, admitting it didn’t know who to blame. The public, however, was neither so confused nor so equivocal. Complexity and ambiguity be damned. They wanted a villain—John, an under-employed former Marine barfly with barbells for brains. And a heroine—Lorena, a young woman tipping the scales at 92 pounds who could hardly speak except to weep. This wasn’t life, it was TV. In fact, it was reality TV, or would have been were such a term yet coined.

The case was emblematic of the times: In the early 90s, the gender wars were especially bloody, casualties running high on both sides. Thelma & Louise, the inciting incident of which was a thwarted rape, was the big movie of 1991. That same year, Anita Hill testified about Coke cans and pubic hairs at the confirmation hearing of U.S. Supreme Court nominee Clarence Thomas. Camille Paglia declared Lorena’s deed a “revolutionary act.” Feminists supporting Lorena flashed the V-for-victory sign, then turned it on its side so it became a pair of scissors: snip snip.

Read the story

Welcome to the Jungle

Caitlin Moran, photo by chrisdonia via Flickr (CC BY-NC-SA 2.0)

In The Cut, Caitlin Moran tells us what it was like to be 18, newly arrived in the big city (in this case, London), interested in sex, and with zero experience of men. Unsurprisingly, she quickly learns that there are a not-insignificant number of men whose “actual desire was just to be unpleasant to a woman somewhere private.”

Every woman I know has had a man like this; they’re a tollbooth you must pass through into true adulthood. The Classic Bad Man is a rite of passage. He should not have to be — it is not to womankind’s betterment that we learn to survive these things — but he is. And what I have observed is this: There are some men who simply desire to see unease and fear in a woman’s face. It is as if they get high off it. They huff it like cocaine. This is their addiction: making women scared. And they will spend their whole lives doing it. Do you know someone like this? I bet you do.

To be sure, there are also not-Bad Men, but interactions with them are frequently troubling as well — they’re working from a different cultural playbook:

This category of bad sexual experiences comes down to the fact that, at this point in history, men’s tabula for women is completely rasa, too. Every problem I had as a teenage girl, noncriminal men also have. There are no manuals about being a man who wishes to have swashbuckling sex adventures with his peers. There are no templates for how to approach a woman in a jolly and uplifting manner, discover her sexual preferences, get feedback while you’re rolling around naked, and learn from her without feeling oddly, horribly emasculated.

While my knowledge about the opposite sex came from MGM musicals and 19th-century literature, men’s tends to come from pornography and best-selling books by pickup artists. Men are working on the assumption they must either look like Burt Reynolds and bum a woman across a landing or else psychologically manipulate women into doing things they wouldn’t normally do, because sex is about, somehow, winning, rather than a collaboration between two people who delight in each other.

Read the essay

The Blue Ridge Country King

AP Photo/Steve Helber

John Lingan | Homeplace: A Southern Town, A Country Legend, and The Last Days of a Mountaintop Honky-Tonk | Houghton Mifflin Harcourt | July 2018 | 21 minutes (5,796 words)

Sure, there’s a quick way to the Troubadour Bar & Lounge: starting from Berkeley Springs, West Virginia, population 624, you simply turn onto Route 38/3, Johnson’s Mill Road, and head up into the Blue Ridge. Swing along pendulous mountain curves that ease past wide grass fields, up through dense tunnels of pin oak and pine. Take it slow at the one-lane wooden bridge and again at the hairpin turn by the vaunting power-line interchange. Past the cemetery, with its green-tinted graves so old that the names and dates are just half-disappeared scars. Then on through the final hypnotic stretch of forest, still on a roller-coaster incline that demands another inch down on the gas just as you might be compelled to slow up and address your lord.

Again, that’s the quick way, only 20 or so minutes of alert mountain driving. But if you aren’t coming from Berkeley Springs — if you’re coming from Capon Bridge, Gerrardstown, Hedgesville, Paw Paw, or any of the dozens of other panhandle towns too small for maps — then it’s even longer. Then it’s all woods, up and down hills with no visible end, past spray-painted houses made of plywood and exposed Tyvek. Look out for smeared snakes and exploded deer, and prepare for shaky trips across metal bridges high above the Potomac’s minor branches. Down below, to the boys swimming in T-shirts and waterproof shoes, your car’s faraway rumble might as well be distant thunder. Read more…

Losing the Middle Ground

Bettmann / Getty

Middle children, your worst nightmare may be coming true: you really are fading into the background. In a cruel, self-fulfilling twist on Middle Child Syndrome, statistics show that a family’s ideal size has shrunk to two kids, leaving the middleborns to go the way of the mastodon. And it’s about more than numbers: those in the middle often exhibit strong traits of empathy, diplomacy, and liberalism. Adam Sternbergh of The Cut asks: what are we giving up as a culture if we lose the Jan Bradys?

According to a study by the Pew Research Center in 1976, “the average mother at the end of her childbearing years had given birth to more than three children.” Read that again: In the ’70s, four kids (or more) was the most common family unit. Back then, 40 percent of mothers between 40 and 44 had four or more children. Twenty-five percent had three kids; 24 percent had two; and 11 percent had one.

For Hopman, middle children are primarily distinguished by an inexhaustible need for attention, a description from which he does not exempt himself. “Britney Spears: middle child,” he points out. “Kesha: middle child. Nicki Minaj: middle child. Also a middle child: Don King.”

“Middle children are evaporating from life, and that isn’t good for all of us,” says Kevin Leman, who literally wrote the book on the subject, his 1985 The Birth Order Book: Why You Are the Way You Are, which has sold over a million copies. “Middle children are like the peanut butter and jelly in the sandwich,” he explains. As for the coming extinction event, he says, “If you like a sandwich with nothing on it, enjoy.”

Birth order also appears to play a part in the decisions of Supreme Court justices. A 2015 paper in Law & Society Review found that, among the 55 justices who served from 1900 to 2010, oldest and only children showed a strong tendency toward conservative ideology, while middle and youngest children favored liberal decisions.

Read the story

The Top 5 Longreads of the Week

HSBC (Keith Tsuji/Getty Images)

This week, we’re sharing stories from David Dayen, M.H. Miller, T. Cooper, Caren Lissner, and Michael Adno.

Sign up to receive this list free every Friday in your inbox. Read more…

Accountability for the Algorithms

Sir Tim Berners-Lee. Photo by Steve Parsons/PA Wire

The World Wide Web is about to reach an amazing and terrible milestone: soon, 50 percent of the world’s population will be online. For Vanity Fair, Katrina Brooker reports on how Tim Berners-Lee, reflecting on the ways corporations like Facebook and Google have misused his creation to manipulate and spy on users, is attempting to revive the original promise of an open and safe web for all. He’s building a new platform, Solid, to give users privacy and control over their information.

Berners-Lee, who never directly profited off his invention, has also spent most of his life trying to guard it. While Silicon Valley started ride-share apps and social-media networks without profoundly considering the consequences, Berners-Lee has spent the past three decades thinking about little else. From the beginning, in fact, Berners-Lee understood how the epic power of the Web would radically transform governments, businesses, societies. He also envisioned that his invention could, in the wrong hands, become a destroyer of worlds, as Robert Oppenheimer once infamously observed of his own creation. His prophecy came to life, most recently, when revelations emerged that Russian hackers interfered with the 2016 presidential election, or when Facebook admitted it exposed data on more than 80 million users to a political research firm, Cambridge Analytica, which worked for Donald Trump’s campaign. This episode was the latest in an increasingly chilling narrative. In 2012, Facebook conducted secret psychological experiments on nearly 700,000 users. Both Google and Amazon have filed patent applications for devices designed to listen for mood shifts and emotions in the human voice.

For the man who set all this in motion, the mushroom cloud was unfolding before his very eyes. “I was devastated,” Berners-Lee told me that morning in Washington, blocks from the White House. For a brief moment, as he recalled his reaction to the Web’s recent abuses, Berners-Lee quieted; he was virtually sorrowful. “Actually, physically—my mind and body were in a different state.” Then he went on to recount, at a staccato pace, and in elliptical passages, the pain in watching his creation so distorted.

The forces that Berners-Lee unleashed nearly three decades ago are accelerating, moving in ways no one can fully predict. And now, as half the world joins the Web, we are at a societal inflection point: Are we headed toward an Orwellian future where a handful of corporations monitor and control our lives? Or are we on the verge of creating a better version of society online, one where the free flow of ideas and information helps cure disease, expose corruption, reverse injustices?

For now, the Solid technology is still new and not ready for the masses. But the vision, if it works, could radically change the existing power dynamics of the Web. The system aims to give users a platform by which they can control access to the data and content they generate on the Web. This way, users can choose how that data gets used rather than, say, Facebook and Google doing with it as they please. Solid’s code and technology is open to all—anyone with access to the Internet can come into its chat room and start coding.

Read the story

Great Reviews Of Movies I Have Never Seen: A Reading List

In a movie theater, rows of red velvet chairs sitting empty in front of a blank screen.
Image by hashi photo via Wikimedia Commons (CC BY 3.0)

Sara Benincasa is a quadruple threat: she writes, she acts, she’s funny, and she has truly exceptional hair. She also reads, a lot, and joins us to share some of her favorite stories (and some of her friends’ favorites, too). 

I will admit upfront that I haven’t seen as many films as I feel I should. I’ve written one, an adaptation of my third book, DC Trip. It was scary to contemplate: I thought, “I haven’t seen enough films to write a film.” And then someone pointed out that the average male aspiring screenwriter would never let that stop him, and I figured this was correct.

I realized — and this is applicable for any job, really — I shouldn’t negotiate from a place of “I’m so lucky anyone would consider me for such a gig.” I should negotiate from a place of “Hell yeah, I can knock this out of the park and I deserve this gig! I will learn what I need to learn, ask questions, do the work, and figure it out as I go along. And I will do a very good job.” And I started watching more films, because while you learn a lot by doing, you also learn a lot by watching. Plus, if you want to do something for a living, it’s only respectful to your art form of choice to, you know, actually study it.

Conveniently enough, I also recently got sober, which means I’ve got more time on my hands now that I don’t spend one to two days a week functioning at the intellectual level of a toaster oven. Did you know that if you replace alcohol with water, you’ll sleep better at night and have a superior command of syntax in the morning? True facts, my friends. You’ll also have to deal with a bunch of stuff you were ignoring, like credit card debt and emotional scars, but you can escape that temporarily at your local movieplex!

Read more…

Tennis vs. Tennis

Andrea Petkovic and Tennis.

Text and Polaroids by Andrea Petkovic

Racquet and Longreads | July 2018 | 16 minutes (4,000 words)

This story is produced in partnership with Racquet magazine.

ALBUQUERQUE, N.M. — When I exit the plane in Albuquerque, the first thing I see is space. So much space and so few people. I’ve come from New York, and the minute I step onto New Mexican soil everything feels like it’s in slow motion. I speak slower, my steps are grander, my breath deeper. The desert landscape is a stark contrast to the crowds that I have become accustomed to in the city, and it resembles nothing we have at home in Germany.

I’m on my way to Sister Bar, where Tennis will be playing. Tennis, in this instance, is a band, and I will be touring New Mexico, Arizona, and California with them. In a bus. I am wearing a wide-brimmed black hat and a faux leopard fur coat, despite the 90-degree heat. Perhaps I’ve overthought my Rock Tour Ensemble because I’m feeling uncharacteristically self-aware about being thrown into this alternate reality. In my real life, I am a tennis player. Full-time. How should I know what’s cool these days?

We will be traveling in a bus, from venue to venue, waking up early, seeking out breakfast burritos, eating too many, sitting on the bus, driving through the desert, six hours, seven hours, arriving at the theater. Cities and states and landscapes become one, unloading the gear, sound-checking, eating dinner, waiting for the show, the show, THE SHOW, the adrenaline-fueled banter after the concert, one beer, two beers, whiskey then vocal rest for Alaina, the lead singer, too little sleep, too little time for basic hygiene but it’s okay because the others have forsaken theirs too, then waking up early and doing it all again.

I’ve decided to do this because I have a hunger for throwing myself into the art world, the music world, the TV and movie world. I’m obsessed with contemporary culture in the widest sense. Are we tennis players part of it? Does experiencing an extraordinary intensity of emotion in your day-to-day job place you outside of conventional reality? And if it does, why do I try to understand it, why can’t I just accept it as it is? That’s why I’m here. Read more…