Search Results for: The Baffler

Why the sudden proliferation of “vibrant” communities in the United States? And what does it even mean?

Is Rockford, Illinois, vibrant? Oh my god yes: according to a local news outlet, the city’s “Mayor’s Arts Award nominees make Rockford vibrant.” The Quad Cities? Check: As their tourism website explains, the four hamlets are “a vibrant community of cities sharing the Mississippi River in both Iowa and Illinois.” Pittsburgh, Pennsylvania? Need you even ask? Pittsburgh is a sort of Athens of the vibrant; a city where dance parties and rock concerts enjoy the vigorous boosting of an outfit called “Vibrant Pittsburgh”; a place that draws young people from across the nation to frolic in its “numerous hip and vibrant neighborhoods,” according to a blog maintained by a consortium of Pittsburgh business organizations.

The vibrations are just as stimulating in the component parts of this exciting new civilization. The people of creative-land use vibrant apps to check their bank accounts, chew on vegetarian “vibrancy bars,” talk to one another on vibrant cellphones, and drive around in cars painted “vibrant white.”

“Dead End on Shakin’ Street.” — Thomas Frank, The Baffler

Featured Longreader: Jesse Farrell, a semi-expat American yogi. See his story picks from The Nation, The Guardian, The New York Review of Books, The Baffler, AlterNet, plus more on his #longreads page.

The Beauty of “Bl-Bl-Bl-Blue Moon”

Getty Images

Barry Yeoman, a man with a lifelong stutter, suggests that while society mostly views a stutter as a disability, stammering really isn’t the problem at all. At the Baffler, he argues that the real problem to cure is the assumption that those who stutter are somehow deficient.

Like virtually all disabilities, stuttering has long been viewed through a medical lens—as a pathology in search of neutralization, an obstacle to a successful life. That lens is embedded in the language of speech impediments and speech pathologists. At best, stuttering has been framed as a “despite” condition: we can be happy and productive despite how we talk.

Some of us, though, have been trying to flip the paradigm, to reframe stuttering as a trait that confers transformative powers. We wear our vulnerability on the outside, and that invites emotional intimacy with others. We slow down conversations, fostering patience. We give texture to language. We gauge character by our listeners’ reactions. We are good listeners ourselves.

“There’s something interesting about stuttering in a world that moves at increasingly breakneck speed,” says St. Pierre, the Alberta professor. For most of human history, we measured time in lunar cycles, menstrual cycles, agricultural cycles. Today we rely on “clock time,” standardized and designed for industrial production. Clock time values efficiency; it has no patience for silences and repeated syllables. “Stuttering highlights that fact: that clock time runs roughshod over all these other ways of creating time, but that they still persist and are still important,” he says. “Stuttering interrupts this hegemonic order of time.”

Alpern wrote an essay for Stammering Pride and Prejudice, an anthology published this year in the United Kingdom. (The British use “stammering” as a synonym for “stuttering.”) St. Pierre has a chapter; so does Constantino, who is one of the book’s editors. In hers, Alpern tells the story of ordering a “Bl-Bl-Bl-Blue Moon” at a bar and finding unexpected pleasure in the extra syllables. Part of the delight is in using a voice that is uniquely hers; part is the hard-earned absence of shame.

Part is physical: “that little loss of control that resolves itself so beautifully sometimes,” she writes. “I am falling through the air for an instant, then catching the ground again, like Fred Astaire pretending to trip when he dances.”

Read the story

If I Made $4 a Word, This Article Would Be Worth $10,000

Illustration by Homestead

Soraya Roberts | Longreads | June 2019 | 10 minutes (2,574 words)

What in the actual fuck. I thought journalists, even just culture journalists, were supposed to be brave. I thought they were supposed to risk their lives, even just psychologically. I thought they were supposed to shout and swear and beat their breasts — fuck everything else. At the very least I thought they were supposed to tell the truth. If any of that’s true, I don’t know what the hell all the people around me are doing. All the people who, I’ve been told again and again, don’t want to bite the hand that feeds, even though the food is shit and the hand is an asshole. I’m ashamed that I was tricked into believing they were better than so many of the people they report on, that their conspicuous support for unions and an industry full of undervalued workers was anything more than a performance. I didn’t think journalists, even just culture journalists, were supposed to be cowards. 

***

If you don’t know who Taffy Brodesser-Akner is, you are very likely not on Media Twitter and I salute you. At one point, Brodesser-Akner was invariably described as one of the busiest freelancers in America and you really did see her byline everywhere. Five years ago, she found her niche writing celebrity profiles for GQ and The New York Times, for which she won three New York Press Club awards. Journalists adore her not only for her prowess at cutting down the various gods we love and hate in equal measure, but also for her ability to lure the reader into being her coconspirator by nimbly threading herself through each story. Because of that, and because of the reach of the publications themselves, and — perhaps most importantly — because of her popularity among her peers, her articles almost always go viral. In 2017, Brodesser-Akner became a staff writer at the Times and this month she is promoting her first novel, Fleishman Is in Trouble.

On June 14, Cosmopolitan published one of roughly 5 million interviews with the debut novelist, this one by Jen Ortiz. I was scrolling through Twitter on a break from writing back-to-back columns and noticed the usual gushing posts by journalists with blue checkmarks next to their names. Those tweets are no real indication the person has actually read the interview they’re sharing, but whatever, because, like, it’s Taffy, you know her! Who doesn’t stan her!?! It’s funny, if you search the article URL in Twitter, initially it’s just tweet after tweet of outsize praise — “I loved this profile of the master profiler” — then, like a sudden stop sign on a 90 mph expressway, there it is: “what in the actual fuck.” That one’s mine.

I’d read the article. I’d seen one of those first tweets and, like I always do, I’d read it for the holy grail every author is looking for: the secret to writing a successful book without wanting to papercut yourself to death with it. “I’m actually the second writer Cosmo has sent,” Ortiz noted, but for some reason her employer still made the mistake of sending someone who had worked with the subject at GQ. Or maybe that’s not a mistake. I don’t actually read Cosmo, and I suppose I should have before I announced with bravado the death of the puff piece last May. Either way, there I was, reading merrily along, then suddenly, like that tweet, I stopped. It was just a line, a line in a small, kind of out-of-place paragraph: “When I started doing the ‘I don’t get out of bed for less than $4 a word’ thing, people started paying me $4 a word.” What in the actual fuck. 

This is what it meant when I posted that quote and those words: It meant, what in the actual fuck.

It meant what fucking other freelancers in the world are making $4 a word right now. It meant what fucking magazines in the world are paying $4 a word right now. It meant what fucking lies is this industry telling us when so many people — people in actual war zones — only dream of making 50¢ a word. It meant in what fucking world can a freelancer treat $4 a word like it’s not near-impossible for the rest of us. The meaning was so obvious that I honestly didn’t think anyone would even notice the message. But they did. And they mistook it for something I didn’t mean at all: “Fuck Taffy.”

The reaction was swift and violent, and, from what I could tell, divided into those who could read (predominantly marginalized writers) and those who could not (predominantly nonmarginalized writers). My point was being illustrated in real time by the journalism industry’s 1 Percent, the mostly white legacy media reflexively rallying around one of their own — T!A!F!F!Y!! Their aggressive cheers distracting from the faceless, nameless collection of freelance writers who were not there to fight, but to have a conversation about parity — about equity — the way the original tweet was intended. These were the freelancers who, like me, had worked their asses off for years and watched disconcertingly as the better their work got, the less it seemed to get them. Unable to make a living, a number of them quit. (Blame Longreads for my recalcitrance.) Like me, they were told it wasn’t personal, but I can’t think of anything more personal than choosing to hand one person a feast while everyone else gets the scraps. Obviously journalism isn’t uniquely inequitable, but it’s particularly egregious for an industry built on telling the truth to do the complete opposite when it comes to its own mechanics. Journalists intent on exposing everyone else refuse to interrogate themselves, relegating most intel to subtweets or DMs, if it’s online at all.

This is the problem with my tweet, or, why it caused such a fuss. For one thing, I’ll cop to not being very diplomatic. In retrospect, “what in the actual fuck” is not the best way to start a conversation about pay disparity, but if we’re being honest, it’s still probably the best way to get it noticed. For another thing, I was calling out an individual who is beloved by the journalism community. It didn’t matter that I wasn’t taking issue with her personally (quid pro quo), that I was highlighting her comment as an example of a systemic issue, that it was the system I had a problem with — nope, nope, nope. What mattered was that in an industry in which it is frowned upon to even side-eye your colleagues in public, I put the word “fuck” within the vicinity of a marquee writer’s name. And I was a nobody. Which is why it became Taffy and her allies versus “the freelancers.” The dominant side had a face, the other side did not. The star reporter once again came out on top, buoyed by a nebulous mass of forgettable freelancers.

Her supporters were loud as fuck, but when you actually looked at what they were saying it literally boiled down to: Taffy Brodesser-Akner is astronomically talented, which is why she is making astronomically more than you, who are not talented, and how dare you say women should be transparent about money then punish her for doing just that, have you even seen how much men make? I mean, what in the actual fuck are you talking about? This is not about one woman. It’s not even about gender equality (for once). It’s about exploitation. For all I care, Taffy Brodesser-Akner could be Michael Lewis with his $10 a word. The point is the same either way — it’s one journalist making several (many several) times what the rest of us do in an industry in which we’re constantly being told there is nothing left to give. Clearly there is, it just happens to be reserved for an exclusive group of self-congratulatory writers and editors benefiting from a corrupt system. And if you dare point out the unfairness of their profit, the whole lot becomes reflexively defensive, distracting from the real issue because it’s their loss and everyone else’s gain if it’s ever addressed. So let’s just attribute $4 a word to a woman achieving against all odds — yaaass, queen!— and move on.

Uhm, okay, but if $4 a word makes you a queen, does that make the rest of us serfs? And why are the serfs mostly, like, LGBTQ writers, people of color, and women in independent publishing? Distressingly, some women seem to have bought into the idea that they make a lot less than certain writers because they are way less talented and hardworking, but I’m finding it hard to believe that so many marginalized writers are less talented and hardworking than so many white people. Am I suggesting the system might be rigged in favor of upwardly mobile white journalists in the vicinity of New York and their upwardly mobile white friends in the vicinity of New York who run the industry? (Could this explain why the Times reviewed its own staff writer’s book and interviewed her on top of that?) Possibly? Maybe? No? Come on! We’ve been banging on about intersectionality and privilege for the past 100 years (it feels like). Has none of that penetrated? Because if one more person suggests that maybe I should just ask for $4 next time, as though I’m not already risking assignments every time I beg for 50 cents, as though organizations aren’t systematically standing in the way of the ability to negotiate, I swear … Just take one look at that clause Vox has been slipping into their contracts, the one preventing freelancers from sharing their rates publicly in order to get better (read: fair) ones. Are you really going to argue that a system that situates the Taffys — and sure, the Michael Lewises — of the world above the rest of us, apart from us, making wads more cash for their “talent and hard work,” is in any way ethical?

I mean, you could just say nothing, which a lot of journalists did. Writers I’d been cordial with unfollowed me. Writers I thought were actual friends said nothing, which I took to be complicity with the elite journalists, whose ranks they were one day hoping to join, or maybe who they were just trying not to piss off. Writers I hung out with weren’t even sure I wasn’t just being a dick. The ones who supported me, who even DM’d me, were overwhelmingly women of color, queer women, and women who had been serially underrecognized, not to mention a couple of guys who’ve been pushed past the point of giving a fuck. On their timelines, a number of the women indicated that everything that needed to be said about the elites could be found in their mischaracterizations of the $4 a word conversation. That these women predominantly used subtweets to make that point publicly implies that, as mad as they were, they were also aware that those same elites still controlled their livelihoods. The irony is that the same people who accused me of being anti-feminist for trying to talk about pay gaps (yes, that’s as stupid as it sounds) were all over Jezebel’s “The Lie of Feminist Meritocracy.” It’s an instance of bold-faced hypocrisy I can only explain by the fact that the piece was written generally enough that they could revert to performative protest without threatening their own position in line for the brass ring.

“Hey I’ve been working all day and off Twitter. Did I miss anything?” Taffy tweeted jokingly the day after the Twitter shitstorm rolled in. A few days later, in an interview with BuzzFeed’s morning show, she called it a disservice to pay transparency, before refocusing the conversation on her emotional support network of defenders. “I had the warmest kindest weekend on Twitter, where I found out that all these people admired me and liked me. I was like, ‘I love Twitter,’” she said, concluding, “It was a really great moment for me.” The coup de grace came right at the end, when she mentioned that at the time it all went down, she’d been lonely and in a terrible hotel in Atlantic City writing a terrible story: “That could be why I get $4 a word.” Oh, girl. There are journalists actually putting their lives on the line for a shot at $1 a word, maybe, if they’re lucky. Christ. I mean, you could say I’ve got sour grapes or envy or jealousy or, I don’t know, a hysterical obsession … with … what? Basic human decency? I can’t imagine how many marginalized journalists seethed at the idea that innate ability and a little elbow grease were the reason a select few journalists made several times more than their pittance. Where was the acknowledgment that those same people were almost always friends with the gatekeepers, that those gatekeepers almost exclusively share their friends’ work, which gets them more work, which leads to better work, which gets them book deals, which leads to higher salaries, ad infinitum?

***

Taffy and I kind of came up as freelancers around the same time — we were friendly if not actually friends. Dying to do work like hers, I emailed her in 2014 and asked for advice. I explained that, despite all my efforts, I hadn’t gotten anywhere near the kinds of bylines she had and I was still struggling financially. She was generous. She mentioned being relentless and lunching with editors. So I tried harder. I even lunched with a few people. Two years later, I received an email from her out of the blue. Bright Wall/Dark Room had just published my essay on the two sides of Christian Slater. I had pitched the profile months earlier in March, but it had been turned down by a number of publications, including GQ and the Times (Taffy freelanced for both at the time). BuzzFeed had offered me $400 for 3,000 words but I said no. By the time June rolled around, even that option had passed me by, but I really wanted to write the piece so I pitched BW/DR and I took $100 for it. I asked for more, but being such a small outlet they honestly didn’t have the money. So, yeah: $100 for 3,000 words. That’s $.03 a word. I figured I wouldn’t be granted an interview with Slater, who I had followed for three decades, and for such a small fee I didn’t bother going to the trouble. But I researched to make up for it and wrote the profile anyway, partly while juggling a holiday in Tobermory — I remember everyone going out to the water while I edited in a slice of sun in the cottage. The piece went up July 11th. Taffy emailed me a day later to congratulate me — she had just gone to proof at GQ on what she described as an identical piece. She regretted coming second. That is to say, I literally had Taffy herself telling me that I had beaten her at her own game, despite playing with less. Of course, she was probably paid a little more than $100. In fact, if she was already making $4 a word at the time, that would have amounted to $17,000 — 170 times my fee. 
As I was saying, what in the actual fuck.

***

Soraya Roberts is a culture columnist at Longreads.

Maybe What We Need Is … More Politics?

Alfred Gescheidt / Getty Images

Aaron Timms | Longreads | February 2019 | 20 minutes (5,514 words)

Alpacas are native to South America, but to find the global center of alpaca spinning you’ll need to travel to Bradford, England. The man most responsible for this quirk of history is Titus Salt. Until the 1830s alpaca yarn was considered an unworkable material throughout Europe. Salt, a jobbing young entrepreneur from the north of England, commercialized a form of alpaca warp that made the animal’s fleece suitable for mass production. Within a decade alpaca, finer and softer than wool, had become the rage of England’s fashionable classes.

Already by the mid-19th century industrialization had begun to disfigure the English countryside with “machinery and tall chimneys, out of which interminable serpents of smoke trailed themselves for ever and ever, and never got uncoiled,” as Dickens put it in Hard Times. The immiseration of the working classes was under way. Troubled by the emerging horrors of the new industrial age, Salt built a model village to house the workers he employed in his textile mill. Saltaire, with its neat, spacious houses, running water, efficient sewerage, parks, schools and recreational facilities, became a symbol of what enlightened capitalism could look like. It was also a model in the truest sense, serving as the inspiration for workers’ villages built later in the 19th century by companies such as Cadbury’s and Lever Brothers, the soap manufacturer that eventually became Unilever.

According to economist Paul Collier, these Victorian capitalists instituted a tradition that survives, however precariously, today: the tradition of “business with purpose, business with a sense of obligation to a workforce and a community.” Among the modern successors of this model of compassionate capitalism, Collier has argued, are U.S. pharmaceutical giant Johnson & Johnson and John Lewis & Partners, the British department store. In the 1940s Johnson & Johnson set out a credo stating that the company’s first responsibility was to its customers. Thanks to this credo, Johnson & Johnson’s management pulled Tylenol from supermarket and pharmacy shelves in a mass recall following a contamination scare in the early 1980s. Now standard practice, this type of product recall was uncommon for its time — and allowed the company to maintain goodwill with its customers. John Lewis, for its part, has prospered through difficult decades for brick-and-mortar retail largely thanks to its unusual power structure: the company is owned by a trust run in the interests of its workforce.

The thread uniting this strain of capitalism, Collier contends in his new book The Future of Capitalism: Facing the New Anxieties, is ethics. An ethics of reciprocal responsibility and care — between owners, workers, and customers — has allowed different businesses to prosper in different eras without destroying the communities and environments around them. But very few businesses are run according to these principles today. According to Collier, it is to this model of reciprocal ethics that capitalism, having lost its way over the past four decades, now must return — and reciprocity must become the principle that guides human interaction at all levels of society, not just in the firm. “Our sense of mutual regard has to be rebuilt,” he says. “Public policy needs to be complemented by a sense of purpose among firms.” “We need to meet each other.” “A new generation needs to reset social narratives.” “Norms need to change.” Prescriptivism today, the future of capitalism tomorrow. Read more…

The Weather and the Wall

iStock / Getty Images Plus, Unsplash, Photo illustration by Katie Kosma

Will Meyer | Longreads | January 2019 | 15 minutes (4,073 words)

“At the museum steps
Didn’t we establish
That all this blood is not a dream
This is progress
And we are not that high
We could almost be redeemed”

 — unreleased song by The Lentils

*

For years, changes in butterfly populations and migrations have been considered an “early warning indicator” of global warming. In 2006, a British butterfly specialist told The New Yorker’s Elizabeth Kolbert that of 10 species living in Southern England at the time, “Every single one has moved northward since 1982.”

Now, several years and many missed early warning indicators later, the National Butterfly Center in Mission, Texas, has received a letter from Customs and Border Protection announcing the government’s intent to build a border wall through critical habitat for 240 species of butterflies and 300 types of birds. The letter explains that the wall will be 36 feet tall and 20 feet wide, and that an additional 150 feet south of the border will be cleared of all vegetation to create an “enforcement zone.” Comparing the wall’s construction with a calamitous weather event, the North American Butterfly Association president told the San Antonio Express-News: “For us to financially survive and weather this storm, we’re trying to create a fund that will be kind of like an endowment.” As of this writing, a GoFundMe created to protect the Center has raised just over $24,000.

Meanwhile, given that Mexico hasn’t “paid for it” and won’t, a GoFundMe to finance the wall’s construction raised $20.5 million before GoFundMe decided to offer refunds. That’s nowhere near enough money to actually build the thing, but enough to make you pretty sure the butterflies don’t stand a chance. Indeed, the president and the Republican-controlled Senate have shut down large swaths of the government for over a month, demanding that the Democrats in the House vote to pay for the wall before the government can be reopened. Still, it’s hard to believe the wall is really going up.
Read more…

If the Rich Really Want To ‘Do Good,’ They Should Become Class Traitors Like FDR

FPG / Getty, Photo illustration by Katie Kosma

Will Meyer | Longreads | October 2018 | 11 minutes (2,846 words)

In July of 2015, writer and ex-McKinsey consultant Anand Giridharadas addressed a room full of elites and their good company in Aspen, Colorado. He was a fellow with The Aspen Institute, a centrist think-tank, which was hosting an “ideas festival.” Giridharadas’ talk took aim at what he dubbed the “Aspen Consensus,” an ideological paradigm in which elites “talk a lot about giving more” and not “about taking less.” He earnestly questioned the social change efforts and “win-win” do-goodery promulgated at the business-friendly get-together. In the speech, Giridharadas walked a thin line: both praising the Aspen community which “meant so much” to him and his wife while also laying into its culture and commandments. He dropped the mic: “We know that enlightened capital didn’t get rid of the slave trade,” and suggested that the “rich fought for policies that helped them stack up, protect and bequeath [their] money: resisting taxes on inheritances and financial transactions, fighting for carried interest to be taxed differently from income, insisting on a sacred right to conceal money in trusts, shell companies and weird islands.”

The talk received a standing ovation, though it certainly ruffled some feathers as well. One attendee confided in Giridharadas that he was speaking to their central struggle in life, while others gave him icy glares and called him an “asshole” at the bar. The conservative New York Times columnist David Brooks wrote about the speech — which had hardly prescribed any policies — and clearly felt so threatened by it that his resulting column was titled “Two Cheers for Capitalism,” and attempted, albeit poorly, to nip any systemic critique of his favored economic system in the bud. But Brooks too realized that there would be a “coming debate about capitalism,” and his column prompted Giridharadas to post his talk online, stirring lots of debate — not squelching it. Read more…

The Myth of the Singular Voice

American actor Denzel Washington on the set of Glory, based on the book by Lincoln Kirstein, and directed by Edward Zwick. (Photo by TriStar Pictures/Sunset Boulevard/Corbis via Getty Images)

In an incisive essay for the Baffler, political scientist Adolph Reed considers how many pop culture creations made with Black audiences in mind — including films like Birth of a Nation, Selma, and Black Panther — are narratives of singular, heroic, often male, achievement. “Tales of inspiration and uplift,” he calls them. Meanwhile, a film like Glory, which hinged on historical accuracy, was received by some with considerably less enthusiasm:

Occasionally on a boring flight, I’ll rewatch the Battle of James Island scene from the magnificent 1989 film, Glory. The scene depicts the first engagement of the 54th Massachusetts Infantry, the first Northern regiment of black troops organized by the Army of the United States to fight against the Confederate insurrection. James Island was a fateful battle outside Charleston, SC, on July 16, 1863. I pulled up the clip on a recent flight and was moved yet again by the powerful imagery of black men finally able to strike a blow against the slaveocracy. Imagining what that felt like for the soldiers of the 54th is always intensely gratifying.

Watching it this time, I remembered how startled I had been when Glory was released to learn that many people, including blacks and people on the left, dismissed or even disparaged the film as a “white savior narrative”—a phrase that is now a routine derogation of certain cross-racial sagas of resistance to white supremacy. In Glory’s case, this complaint arose mainly in response to the (historically accurate) depiction of the regiment’s commanding officers as Northern whites.

This objection left me dumbfounded. After all, the 54th Massachusetts was a real historical entity. As a compromise to ensure political support, it was stipulated that its officers be white. Nonetheless, prominent abolitionists, including Frederick Douglass and the free black community of Boston, were enthusiastic about its formation and instrumental in recruiting its ranks.

Now I think I understand. I’ve long suspected that, to a certain strain of race-conscious or antiracist discourse, historical exploration in popular culture was less important than the propagation of tales of inspiration and uplift. These fables typically feature singular black heroes who have overcome crushing racist adversity against all odds. In recent years, a steady stream of films and other narratives have openly embraced that preference.

He suggests these escapist portrayals echo what is disturbingly superficial about our current drive to uplift diverse leadership and voices in media. “Winning anything politically — policies or changes in power relations — is not the point,” Reed writes.

Decisions by blacks to support nonblack candidates or social policies not expressed in race-first terms are interpreted as evidence of flawed, limited, misguided, or otherwise co-opted black agency. The idea that blacks, like everyone else, make their history under conditions not of their own choosing becomes irrelevant, just another instance of insufficient symbolic representation.

The notion that black Americans are political agents just like other Americans, and can forge their own tactical alliances and coalitions to advance their interests in a pluralist political order is ruled out here on principle. Instead, blacks are imagined as so abject that only extraordinary intervention by committed black leaders has a prayer of producing real change. This pernicious assumption continually subordinates actually existing history to imaginary cultural narratives of individual black heroism and helps drive the intense—and myopic—opposition that many antiracist activists and commentators express to Bernie Sanders, social democracy, and a politics centered on economic inequality and working-class concerns.

Read the essay

The Far Right’s Fight Against Race-Conscious School Admissions

WASHINGTON, DC - OCTOBER 10: Attorney Bert Rein (L) speaks to the media while standing with plaintiff Abigail Noel Fisher (R), after the U.S. Supreme Court heard arguments in her case on October 10, 2012 in Washington, DC. The high court heard oral arguments in Fisher v. University of Texas at Austin and is tasked with ruling on whether the university's consideration of race in admissions is constitutional. (Photo by Mark Wilson/Getty Images)

Late in the afternoon on July 3, the Department of Justice announced it was rescinding 24 documents issued by the Obama administration between 2011 and 2016. The documents offered guidance to a range of constituencies, including homeowners, law enforcement, and employers. Some detailed employment protections for refugees and asylees; seven of the 24 discussed policies and Supreme Court rulings on race-conscious admissions practices in elementary, secondary, and post-secondary schools. In its statement, the DOJ called the guides “unnecessary, outdated, inconsistent with existing law, or otherwise improper.”

No immediate policy change will come from the documents’ removal. It’s more of a signal, a gesture in a direction, a statement about ideology. The Trump administration has already enacted several hard-line positions on immigration. And the Sessions-backed Justice Department has made a habit of signaling, by way of gesture, its opposition to affirmative action, and its belief that race-conscious policies, specifically, often amount to acts of discrimination.

***

The term “affirmative action” is ambiguous and has never been strictly defined. It’s a collection of notions, gestures, and ideas that existed before its present-day association with race. According to Smithsonian, the term was likely first used in the Depression-era Wagner Act. This legislation aimed to end harmful labor practices and encourage collective bargaining. It also mandated that employers found in violation “take affirmative action including reinstatement of employees with or without backpay” to prevent the continuation of harmful practices. The reinstatement and payment of dismissed employees were affirmative gestures that could be taken to right a wrong.

Nearly a decade later, in 1941, under pressure from organizer A. Philip Randolph, President Franklin D. Roosevelt issued Executive Order 8802 to prohibit race-based discrimination in the defense industries during the buildup to WWII. It is considered the first federal action to oppose racial discrimination since Reconstruction, and paved the way for President John F. Kennedy, who was the first to use “affirmative action” in association with race in Executive Order 10925. Kennedy’s order instructed government contractors to take “affirmative action to ensure that applicants are employed,” regardless of “race, creed, color, or national origin.” President Lyndon B. Johnson carried Kennedy’s mandate forward and expanded its enforcement when he issued Executive Order 11246 in 1965. Two years later, Johnson amended his own order to add sex to the list of protected attributes.

It was Republican president Richard Nixon who expanded the use of affirmative action to ensure equal employment in all facets of government in 1969, when he issued Executive Order 11478. Nixon ran for office in 1968 on “law and order” and “tough on crime” messaging. He believed that what he called “black capitalism,” the idea of thriving black communities with high rates of employment and entrepreneurship, would ease the agitations of civil rights groups and end urban unrest. At the time, Nixon’s rhetoric won the support of a smattering of black cultural figures such as James Brown. “Black capitalism” was little more than a co-optation of some of the tenets of Black Power, which itself had come from a long-established line of conservative black political thought that emphasized economic empowerment and independence, self-determination, and personal responsibility. In his version, Nixon envisioned only a slight role for the federal government; without the push of significant government investment, the policies and programs he created didn’t result in sweeping change. Still, shadows of Nixon’s thinking on black economics endured: They’re present in multiple speeches Obama made to black audiences during his presidency; Jay Z’s raps about the transformative, generational effects of his wealth; Kanye West’s TMZ and Twitter rants. The backlash Nixon faced is also remarkably similar in tone and content to today’s challenges to affirmative action, which typically involve a white person’s complaints about the incremental gains made by members of a previously disadvantaged group:

In 1969 Section 8(a) of the Small Business Act authorized the SBA to manage a program to coordinate government agencies in allocating a certain number of contracts to minority small businesses—referred to as procurements or contract “set-asides.” Daniel Moynihan, author of the controversial Moynihan Report, helped shape the program. By 1971 the SBA had allocated $66 million in federal contracts to minority firms, making it the most robust federal aid to minority businesses. Still, the total contracts given to minority firms amounted to only 0.1 percent of the $76 billion in total federal government contracts that year.

Yet even these minuscule minority set-asides immediately faced backlash from blue-collar workers, white construction firms, and conservatives, who called them “preferential treatment” for minorities. Ironically, multiple studies revealed that 20 percent of these already meager set-asides ended up going to white-owned firms.

***

A sense of lost advantage and power seems to animate both historical and recent challenges to race-based policies and practices. In Regents of the University of California v. Bakke (1978), the first affirmative action case the Supreme Court ruled on, Allan Bakke, a white applicant to the University of California at Davis medical school, sued the school after being twice denied admission. The school had created a system to set aside a certain number of spaces for students from marginalized groups. The Court decided that practices relying on quota systems were unconstitutional, but it upheld the use of race in admissions decisions as long as it was among a host of other factors. Rulings in subsequent cases, such as Grutter v. Bollinger (2003) and, most recently, Fisher v. University of Texas (2016), supported the use of race in admissions and reiterated the federal government’s interest in the diversity of the nation’s institutions. In the most recent case, now-retired Justice Anthony Kennedy provided the Court’s swing vote.

Plaintiffs in affirmative action challenges tend to argue that race-conscious admissions policies violate rights granted by the Fourteenth Amendment, especially its clause guaranteeing “equal protection of the laws.” Ratified 150 years ago last week, the Fourteenth Amendment established birthright citizenship and defined citizenship’s parameters. Its ideas originated in the years leading up to Reconstruction, during “colored conventions” held among African American leaders and activists, and form the underpinnings of Brown v. Board of Education (1954) and some provisions of the Civil Rights Act of 1964.

One of the most prominent opponents of affirmative action, Edward Blum, a fellow at the American Enterprise Institute, actively seeks and recruits aggrieved plaintiffs and attorneys to challenge race-based policies in school admissions and voting practices. Blum was the force behind the complaint of Abigail Fisher, the white student at the center of Fisher v. University of Texas. According to the New York Times:

In the Texas affirmative action case, he told a friend that he was looking for a white applicant to the University of Texas at Austin, his own alma mater, to challenge its admissions criteria. The friend passed the word to his daughter, Abigail Fisher. About six months later, the university rejected Ms. Fisher’s application.

“I immediately said, ‘Hey, can we call Edward?’” she recalled in an interview.

The case went to the Supreme Court twice, and though Ms. Fisher was portrayed as a less than stellar student, vilified as supporting a racist agenda, and ultimately lost, she said she still believed in Mr. Blum. “I think we started a conversation,” she said. “Edward obviously is not going to just lie down and play dead.”

Blum’s first lawsuit came about after he lost a congressional election in Houston because, he felt, the boundaries of his district were drawn solely along racial lines. He is now behind lawsuits against Harvard University and the University of North Carolina at Chapel Hill, which allege the schools’ admissions policies discriminate against Asian American applicants. It is interesting and bold to use white women and Asian American students to dismantle programs meant to address America’s legacy of discrimination. Both groups have benefited significantly from Reconstruction and civil rights-era policies and legislation. Do Blum, Sessions, and their supporters believe race-based policies are irrelevant, illegal, or improper because, for many, they’ve worked? I sense something more nefarious at play: a mounting sense of loss and growing resentment that the demographic shifts in our country also mean inevitable shifts in who holds power.

The Sessions-helmed Justice Department’s signals, and the nomination of Judge Brett Kavanaugh to the high court, have, I’m sure, heartened activists like Blum. For The Nation, Eric Foner wrote about how the Fourteenth Amendment’s ambiguity is what allows it to be used in a way so at odds with the spirit of its origins. It is that same ambiguity, he says, that will someday, in a different political climate, allow for another era of correction.

Private Telegram, Public Strife

Telegram App, Wikimedia Commons

Jacob Silverman | Longreads | July 2018 | 10 minutes (2,418 words)

Telegram, a messaging app with more than 200 million users, is a company known for its rakish independence. Pavel Durov, who created the app with his brother, Nikolai, is a 33-year-old from St. Petersburg, Russia, with a taste for dark suits and tax-free municipalities. In 2006, he founded VKontakte (VK), Russia’s answer to Facebook, which quickly became the country’s largest social network and a target of its security services. Durov, who identifies as a “part-time troll” in his Twitter bio, earned a reputation as a sort of maverick entrepreneur, a persona that has come with both free-speech absolutism and immature antics. His most notorious stunt took place in May 2012, when he stood at his office window and tossed paper airplanes made of rubles down onto the street below. He later explained that he had been talking to a vice president at his company who had been awarded a large bonus, and when the VP said that he didn’t care about money, the two decided to throw cash out the window—until bystanders started fighting over the windfall.

Read more…