The Power and Business of Hip-Hop: A Reading List on an American Art Form

Ever since Black and Latino Americans created hip-hop at South Bronx block parties during the 1970s, this highly original, uniquely American music has continued to evolve, while simultaneously taking root in countless countries throughout the world.
As cultural critic Harry Allen once said, “hip-hop is the new jazz.” But like jazz, hip-hop is more than music. It’s a culture. “‘Hip-hop,’ once a noun,” Kelefa Sanneh wrote in The New Yorker, “has become an adjective, constantly invoked, if rarely defined; people talk about hip-hop fashion and hip-hop novels, hip-hop movies and hip-hop basketball. Like rock and roll in the nineteen-sixties, hip-hop is both a movement and a marketing ploy, and the word is used to describe almost anything that’s supposed to appeal to young people.” Beyond marketing and corporatization, hip-hop culture has always included dance, rap, fashion, design, stretching language, and reclaiming public spaces, and its creative, genre-spanning approach has allowed artists to represent their lives in a world that often ignores or misrepresents them. In the San Francisco Gate in 2003, Adam Mansbach, author of Go the F**k To Sleep, described hip-hop culture as “assembled from spare parts, ingeniously and in public. Paint cans refitted with oven-cleaner nozzles transformed subway trains into mobile art galleries. Playgrounds and parks became nightclubs; turntables and records became instruments. Scraps of linoleum and cardboard became dance floors. Verbal and manual dexterity turned kids into stars, and today’s artists grew up listening to the first strains of the musical form.” As Jeff Chang, author of Can’t Stop, Won’t Stop, put it, hip-hop culture is “naturally interdisciplinary”: we “mix signifiers, we break everything down to bits and bytes and rebuild something new.” I love that description.
When American Media Was (Briefly) Diverse

Danielle A. Jackson | Longreads | September 2019 | 16 minutes (4,184 words)
The late summer night Tupac died, I listened to All Eyez on Me at a record store in an East Memphis strip mall. The evening felt eerie and laden with meaning. It was early in the school year, 1996, and through the end of the decade, Adrienne, Jessica, Karida and I were a crew of girlfriends at our high school. We spent that night, and many weekend nights, at Adrienne’s house.
Our public school had been all white until a trickle of black students enrolled during the 1966–67 school year. That was 12 years after Brown v. Board of Education and six years after the local NAACP sued the school board for maintaining dual systems in spite of the ruling. In 1972, a federal district court ordered busing; more than 40,000 white students abandoned the school system by 1980. In response, the board created an “optional program” of specialized and accelerated courses in some of its schools. Students could enter the programs regardless of district lines if they met certain academic requirements. This kind of competition helped retain some white students, but it also created two separate tracks within those institutions — a tenuous, half-won integration. For me, two decades later, it meant a “high-performing school” with a world of resources I knew I should be grateful for, but at a cost. There were few black teachers. Black students in the accelerated program were scattered about, small groups of “onlies” in all their classes. Black students who weren’t in the accelerated program got rougher treatment from teachers and administrators. An acrid grimness hung in the air. It felt like being tolerated rather than embraced.
My friends and I did share a lunch period. At our table, we traded CDs we’d gotten in the mail: Digable Planets’ Blowout Comb, D’Angelo’s Brown Sugar, the Fugees’ The Score. An era of highly visible black innovation was happening alongside a growing awareness of my own social position. I didn’t have those words then, but I had my enthusiasms. At a Maxwell concert one sweaty night on the Mississippi, we saw how ecstasy, freedom, and black music commingle and coalesce into a balm. We watched the films of the ’90s wave together, and while most had constraining gender politics, Love Jones, the Theodore Witcher–directed feature about a group of brainy young artists in Chicago, made us wish for a utopic city that could make room for all we would become.
We also loved to read the glossies — what ’90s girl didn’t? We especially salivated over every cover of Vibe. Adrienne and I were fledgling writers who experimented a lot and adored English class. In the ’90s, the canon was freshly expanding: We read T.S. Eliot alongside Kate Chopin and Chinua Achebe. Something similar was happening in magazines. Vibe’s mastheads and ad pages were full of black and brown people living, working, and loving together and out front — a multicultural ideal hip-hop had made possible. Its “new black aesthetic” meant articles were fresh and insightful but also hyper-literary art historical objects in their own right. Writers were fluent in Toni Morrison and Ralph Ellison as well as Biggie Smalls. By the time Tupac died, Kevin Powell had spent years contextualizing his life within the global struggle for black freedom. “There is a direct line from Tupac in a straitjacket [on the popular February 1994 cover] to ‘It’s Obama Time’ [the September 2007 cover, one of the then senator’s earliest],” former editor Rob Kenner told Billboard in a Vibe oral history. He’s saying Vibe helped create Obama’s “coalition of the ascendant” — the black, Latinx, and young white voters who gave the Hawaii native two terms. For me, the pages reclaimed and retold the American story with fewer redactions than my history books. They created a vision of what a multiethnic nation could be.
* * *
“There was a time when journalism was flush,” Danyel Smith told me on a phone call from a summer retreat in Massachusetts. She became music editor at Vibe in 1994, and was editor in chief during the late ’90s and again from 2006 to 2008. The magazine, founded by Quincy Jones and Time, Inc. executives in 1992, was the “first true home of the culture we inhabit today,” according to Billboard. During Smith’s first stint as editor in chief, its circulation more than doubled. She wrote the story revealing R. Kelly’s marriage to then 15-year-old Aaliyah, as well as cover features on Janet Jackson, Wesley Snipes, and Whitney Houston. Smith was at the helm when the magazine debuted its Obama covers in 2007 — Vibe was the first major publication to endorse the freshman senator. When she described journalism as “flush,” Smith was talking about the late ’80s, when she started out in the San Francisco Bay. “Large cities could support with advertising two, sometimes three, alternative news weeklies and dailies,” she said.
The industry has collapsed and remade itself many times since then. Pew reports that between 2008 and 2018, journalism jobs declined 25 percent, a net loss of about 28,000 positions. Business Insider reports losses at 3,200 jobs this year alone. Most reductions have been in newspapers. A swell in digital journalism has not offset the losses in print, and it’s also been volatile, with layoffs several times over the past few years, as outlets “pivot to video” or fail to sustain venture-backed growth. Many remaining outlets have contracted, converting staff positions into precarious freelance or “permalance” roles. In a May piece for The New Republic, Jacob Silverman wrote about the “yawning earnings gap between the top and bottom echelons” of journalism reflected in the stops and starts of his own career. After a decade of prestigious headlines and publishing a book, Silverman called his private education a “sunken cost” because he hadn’t yet won a coveted staff role. If he couldn’t make it with his advantageous beginnings, he seemed to say, the industry must be truly troubled. The prospect of “selling out” — of taking a corporate job or work in branded content — seemed more concerning to him than a loss of the ability to survive at all. For the freelance collective Study Hall, Kaila Philo wrote how the instability in journalism has made it particularly difficult for black women to break into the industry, or to continue working and developing if they do. The overall unemployment rate for African Americans has been twice that of whites since at least 1972, when the government started collecting the data by race. According to Pew, newsroom employees are more likely to be white and male than U.S. workers overall. Philo’s report mentions the Women’s Media Center’s 2018 survey on women of color in U.S. news, which states that just 2.62 percent of all journalists are black women. 
In a write-up of the data, the WMC noted that fewer than half of newspapers and online-only newsrooms had even responded to the original questionnaire.
* * *
According to the WMC, about 2.16 percent of newsroom leaders are black women. If writers are instrumental in cultivating our collective conceptions of history, editors are arguably more so. Their sensibilities influence which stories are accepted and produced. They shape and nurture the voices and careers of the writers they work with. That makes who isn’t there noteworthy. “I think it’s part of the reason why journalism is dying,” Smith said. “It’s not serving the actual communities that exist.” In a July piece for The New Republic, Clio Chang called the push for organized labor among freelancers and staff writers at digital outlets like Vox and Buzzfeed, as well as at legacy print publications like The New Yorker, a sign of hope for the industry. “In the most basic sense, that’s the first norm that organizing shatters — the isolation of workers from one another,” Chang wrote. Notably, Vox’s union negotiated a diversity initiative in its bargaining agreement, mandating that 40 to 50 percent of applicants interviewed come from underrepresented backgrounds.
“Journalism is very busy trying to serve a monolithic imaginary white audience. And that just doesn’t exist anymore,” Smith told me. U.S. audiences haven’t ever been truly homogeneous. But the media institutions that serve us, like most facets of American life, have been deliberately segregated and reluctant to change. In this reality, alternatives sprouted. Before Vibe’s launch, Time, Inc. executives wondered whether a magazine focused on black and brown youth culture would have any audience at all. Greg Sandow, an editor at Entertainment Weekly at the time, told Billboard, “I’m summoned to this meeting on the 34th floor [at the Time, Inc. executive offices]. And here came some serious concerns. This dapper guy in a suit and beautifully polished shoes says, ‘We’re publishing this. Does that mean we have to put black people on the cover?’” Throughout the next two decades, many publications serving nonwhite audiences thrived. Vibe spun off, creating Vibe Vixen in 2004. The circulations of Ebony, JET, and Essence, legacy institutions founded in 1945, 1951, and 1970, remained robust — the New York Times reported in 2000 that the number of Essence subscribers “sits just below Vogue magazine’s 1.1 million and well above the 750,000 of Harper’s Bazaar.” One World and Giant Robot launched in 1994, Latina and TRACE in 1996. Honey’s preview issue, with Lauryn Hill on the cover, hit newsstands in 1999. Essence spun off to create Suede, a fashion and culture magazine aimed at a “polyglot audience,” in 2004. A Magazine ran from 1989 to 2001; Hyphen launched with two young reporters at the helm the following year. In a piece for Columbia Journalism Review, Camille Bromley called Hyphen a celebration of “Asian culture without cheerleading” invested in humor, complication, and complexity, destroying the model minority myth. Between 1956 and 2008, the Chicago Defender, founded in 1905 and a noted, major catalyst for the Great Migration, published a daily print edition. 
During its flush years, the Baltimore Afro-American, founded in 1892, published separate editions in Philadelphia, Richmond, and Newark.
The recent instability in journalism has been devastating for the black press. The Chicago Defender discontinued its print editions in July. Johnson Publications, Ebony and JET’s parent company, filed for bankruptcy earlier this year after selling the magazines to a private equity firm in 2016. Then it put up for sale its photo archive — more than 4 million prints and negatives. Its record of black life throughout the 20th century includes images of Emmett Till’s funeral, in which the 14-year-old’s mutilated body lay in state, and Moneta Sleet Jr.’s Pulitzer Prize–winning image of Coretta Scott King mourning with her daughter, Bernice King. It includes casually elegant images of black celebrities at home and shots of everyday street scenes and citizens — the dentists and mid-level diplomats who made up the rank and file of the ascendant. John H. Johnson based Ebony and JET on LIFE, a large glossy heavy on photojournalism with a white, Norman Rockwell aesthetic and occasional dehumanizing renderings of black people. Johnson’s publications, like the elegantly attired stars of Motown, were meant as proof of black dignity and humanity. In late July, four large foundations formed a historic collective to buy the archive, shepherd its preservation, and make it available for public access.
The publications’ written stories are also important. Celebrity profiles offered candid, intimate views of famous, influential black figures and detailed accounts of everyday black accomplishment. Scores of skilled professionals ushered these pieces into being: Era Bell Thompson started out at the Chicago Defender and spent most of her career in Ebony’s editorial leadership. Tennessee native Lynn Norment worked for three decades as a writer and editor at the publication. André Leon Talley and Elaine Welteroth passed through Ebony for other jobs in the industry. Taken together, their labor was a massive scholarly project, a written history of a people deemed outside of it.
Black, Latinx, and Asian American media are not included in the counts on race and gender that the WMC reports. The WMC gets its data from the American Society of News Editors (ASNE), and Cristal Williams Chancellor, the WMC’s director of communications, told me she hopes news organizations will be more “aggressive” in helping them “accurately indicate where women are in the newsroom.” While men dominate leadership roles in mainstream newsrooms, news wires, TV, and audio journalism, publications targeting multicultural audiences have also had a reputation for gender trouble, with a preponderance of male cover subjects, editorial leaders, and features writers. Kim Osorio, the first woman editor in chief at The Source, was fired from the magazine after filing a complaint about sexual harassment. Osorio won a settlement for wrongful termination in 2006 and went on to help launch BET.com and write a memoir before returning to The Source in 2012. Since then, she’s made a career writing for TV.
* * *
This past June, Nieman Lab published an interview with Jeffrey Goldberg, editor in chief of The Atlantic since 2016, and Adrienne LaFrance, the magazine’s executive editor. The venerable American magazine was founded in Boston in 1857. Among its early supporters were Ralph Waldo Emerson, Nathaniel Hawthorne, Herman Melville, and Harriet Beecher Stowe. It sought to promote an “American ideal,” a unified yet pluralistic theory of American aesthetics and politics. After more than a century and a half of existence, women writers are not yet published in proportion to women’s share of the country’s population. The Nieman piece focused on progress the magazine has made in recent years toward equitable hiring and promoting: “In 2016, women made up just 17 percent of editorial leadership at The Atlantic. Today, women account for 63 percent of newsroom leaders.” A few days after the piece’s publication, a Twitter user screen-capped a portion of the interview where Goldberg was candid about areas in which the magazine continues to struggle:
GOLDBERG: We continue to have a problem with the print magazine cover stories — with the gender and race issues when it comes to cover story writing. [Of the 15 print issues The Atlantic has published since January 2018, 11 had cover stories written by men. — Ed.]
It’s really, really hard to write a 10,000-word cover story. There are not a lot of journalists in America who can do it. The journalists in America who do it are almost exclusively white males. What I have to do — and I haven’t done this enough yet — is again about experience versus potential. You can look at people and be like, well, your experience is writing 1,200-word pieces for the web and you’re great at it, so good going!
That’s one way to approach it, but the other way to approach it is, huh, you’re really good at this and you have a lot of potential and you’re 33 and you’re burning with ambition, and that’s great, so let us put you on a deliberate pathway toward writing 10,000-word cover stories. It might not work. It often doesn’t. But we have to be very deliberate and efficient about creating the space for more women to develop that particular journalistic muscle.
My Twitter feed of writers, editors, and book publicists erupted, mostly at the excerpt’s thinly veiled statement on ability. Women in my timeline responded with lists of writers of longform — books, articles, and chapters — who happened to be women, or people of color, or some intersection therein. Goldberg initially said he’d been misquoted. When Laura Hazard Owen, the deputy editor at Nieman who’d conducted the interview, offered proof that Goldberg’s statements had been delivered as printed, he claimed he had misspoken. Hazard Owen told the L.A. Times she believes that The Atlantic is, overall, “doing good work in diversifying the staff there.”
Still, it’s a difficult statement for a woman writer of color to hear. “You literally are looking at me and all my colleagues, all my women colleagues and all my black colleagues, all my colleagues of color and saying, ‘You’re not really worthy of what we do over here.’ It’s mortifying,” Smith told me. Goldberg’s admission may have been a misstatement, but it mirrors the continued whiteness of mainstream mastheads. It checks out with the Women’s Media Center’s reports and the revealing fact of how much data is missing from even those important studies. It echoes the stories of black women who work or worked in journalism, who have difficulty finding mentors, or who burn out from the weight of wanting to serve the chronically underserved. It reflects my own experiences, in which I have been told multiple times in a single year that I am the only black woman editor that a writer has ever had. But it doesn’t corroborate my long experience as a reader. What happened to the writers and editors and multihyphenates from the era of the multicultural magazine, that brief flash in the ’90s and early aughts when storytellers seemed to reflect just how much people of color lead in creating American culture? Who should have formed a pipeline of leaders for mainstream publications when the industry began to contract?
* * *
In addition to her stints at Vibe, Smith has edited for Billboard and Time, Inc. publications, and published two novels. She was culture editor for ESPN’s digital magazine The Undefeated before going on book leave. Akiba Solomon is an author, the editor of two books, and currently senior editorial director at Colorlines, a digital news daily published by Race Forward. She started as an intern at YSB in 1995 before going on to write and edit for Jane, Glamour, Essence, Vibe Vixen, and The Source. She told me that even at magazines without predominantly black staffs, she’d worked with other black people, though not often directly. At black magazines, she was frequently edited by black women. “I’ve been edited by Robin Stone, Vanessa DeLuca [formerly editor in chief of Essence, currently running the Medium vertical ZORA], Ayana Byrd, Kierna Mayo, Cori Murray, and Michaela Angela Davis.” Solomon’s last magazine byline came last year, an Essence story on black women activists who organize in culturally relevant ways to fight and prevent sexual assault.
Solomon writes infrequently for publications now, worn down by conditions in journalism she believes are untenable. At the hip-hop magazines, the sexism was a deterrent, and later, “I was seeing a turn in who was getting the jobs writing about black music” when it became mainstream. “Once folks could divorce black music from black culture it was a wrap,” she said. At women’s magazines, Solomon felt stifled by “extremely narrow” storytelling. Publishing, in general, Solomon believes, places unsustainable demands on its workers.
When we talk about the death of print, we rarely talk about the conditions that made it ripe for obsolescence. The slowness and reluctance with which mainstream media has integrated its mastheads (or kept them integrated) has meant that the industry’s content has suffered. And the work environments have placed exorbitant burdens on the people of color who do break through. In Smith’s words:
You feel that you want to serve these people with good and quality content, with good and quality graphics, with good and quality leadership. And as a black person, as a black woman, regardless of whether you’re serving a mainstream audience, which I have at a Billboard and at Time, Inc., or a multicultural audience, which I have at Vibe, it is difficult. And it’s actually taken me a long time to admit that to myself. It does wear you down. And I ask myself why have I always, always stayed in a job two and a half to three years, especially when I’m editing? It’s because I’m tired by that time.
In a July story for Politico, black journalists from The New York Times and the Associated Press talked about how a sophisticated understanding of race is critical to ethically and thoroughly covering the current political moment. After the August 3 massacre in El Paso, Lulu Garcia-Navarro wrote about how the absence of Latinx journalists in newsrooms has created a vacuum that allows hateful words from the president to ring unchallenged. Lacking the necessary capacity, many organizations cover race-related topics, often matters of life and death, without context or depth. As outlets miss the mark, journalists of color may take on the added work of acting as “the black public editor of our newsrooms,” Astead Herndon from the Times said on a Buzzfeed panel. Elaine Welteroth wrote about the physical exhaustion she experienced during her tenure as editor in chief at Teen Vogue in her memoir More Than Enough. She was the second African American editor in chief in parent company Condé Nast’s 110-year history:
I was too busy to sleep, too frazzled to eat, and TMI: I had developed a bizarre condition where I felt the urge to pee — all the time. It was so disruptive that I went to see a doctor, thinking it may have been a bladder infection.
Instead, I found myself standing on a scale in my doctor’s office being chastised for accidentally dropping nine more pounds. These were precious pounds that my naturally thin frame could not afford to lose without leaving me with the kind of bony body only fashion people complimented.
Condé Nast shuttered Teen Vogue’s print edition in 2017, despite record-breaking circulation, increased political coverage, and an expanded presence on the internet during Welteroth’s tenure. Welteroth left the company to write her book and pursue other ventures.
Mitzi Miller was editor in chief of JET when it ran the 2012 cover story on Jordan Davis, a Florida teenager shot and killed by a white vigilante over his loud music. “At the time, very few news outlets were covering the story because it occurred over a holiday weekend,” she said. To write the story, Miller hired Denene Millner, an author of more than 20 books. With interviews from Jordan’s parents, Ron Davis and Lucy McBath, the piece went viral and was one of many stories that galvanized the contemporary American movement against police brutality.
Miller started working in magazines in 2000, and came up through Honey and Jane before taking the helm at JET and then at Ebony in 2014. She edits for the black website theGrio when she can and writes a piece for a print magazine roughly once a year. Shrinking wages have made it increasingly difficult to make a life in journalism, she told me. After working at a number of dream publications, Miller moved on to film and TV development.
Both Miller and Solomon noted how print publications have been slow to evolve. “It’s hard to imagine now, particularly to digital native folks, but print was all about a particular format. It was about putting the same ideas into slightly different buckets,” Solomon said. On the podcast Hear to Slay, Vanessa DeLuca spoke about how reluctant evolution may have imperiled black media. “Black media have not always … looked forward in terms of how to build a brand across multiple platforms.” Some at legacy print institutions still seem to hold internet writing in lower esteem (“You can look at people and be like, well, your experience is writing 1,200-word pieces for the web and you’re great at it, so good going!” were Goldberg’s words to Nieman Lab). Often, pay structures reflect this hierarchy. Certainly, the internet’s speed and accessibility have lowered barriers to entry and made it such that rigor is not always a requirement for publication. But it’s also changed information consumption patterns and exploded the possibilities of storytelling.
Michael Gonzales, a frequent contributor to this site and a writer I’ve worked with as an editor, started in magazines in the 1980s as a freelancer. He wrote for The Source and Vibe during a time that overlapped with Smith’s and Solomon’s tenures, the years now called “the golden era of rap writing.” The years correspond to those moments I spent reading magazines with my high school friends. At black publications, he worked with black women editors all the time, but “with the exception of the Village Voice, none of the mainstream magazines employed black editors.” Despite the upheaval of the past several years (“the money is less than back in the day,” he said), Gonzales seems pleased with where his career has landed: “I’ve transformed from music critic/journalist to an essayist.” He went on to talk about how now, with the proliferation of digital magazines:
I feel like we’re living in an interesting writer time where there are a number of quality sites looking for quality writing, especially in essay form. There are a few that sometimes get too self-indulgent, but for the most part, especially in the cultural space (books, movies, theater, music, etc.), there is a lot of wonderful writing happening. Unfortunately you are the only black woman editor I have, although a few years back I did work with Kierna Mayo at Ebony.
* * *
Danielle A. Jackson is a contributing editor at Longreads.
Editor: Sari Botton
Fact checker: Steven Cohen
Copy editor: Jacob Z. Gross
The American Worth Ethic

Bryce Covert | Longreads | April 2019 | 13 minutes (3,374 words)
“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.
The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes and overall wealth and in greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.
Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.
* * *
Trump has a long history of antipathy to the poor, a word which he uses as a synonym for “welfare,” which he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”
Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”
It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it further inflated by reducing how much went to the government. Donald and his siblings helped their parents dodge taxes with sham corporations, improper deductions, and undervalued assets, evading levies on gifts and inheritances.
If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.
Even the money that was made squarely owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.
Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.
That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In a lengthy report, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which until the Trump administration existed only for direct cash assistance and food stamps, require a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork proving those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are deemed important only for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country expects only the poorest to work to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.
* * *
The report doesn’t just represent an ethos in the administration; it was also a justification for concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order directing federal agencies to review public assistance programs to see if they could impose work requirements unilaterally, to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”
The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.
These aren’t fiscal necessities — they’re crackdowns on the poor that are not just cruel but out of step with real life, justified by the idea that people should prove themselves worthy of the benefits that help them survive. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.
Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.
Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.
If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries in the percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like the lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.
We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?
Plenty of other countries don’t attach such strings. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; instead, it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that Denmark also has a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.
Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.
* * *
The CEA’s report on work requirements claimed that nonwork is particularly prevalent among adults “living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country has, on average, just $826 each in income from capital investments; the average for those in the top 1 percent is more than $16 million.
The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.
In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.
While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings could qualify for the new discount. If someone is passively involved, however — a shareholder who doesn’t have much to do with the day-to-day work of the company — then he gets 100 percent of the new benefit.
Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay it will be paying less when tax time comes due — $4.4 million less, to be exact.
Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.
* * *
Trump’s CEA report is accurate about at least one thing: Our country has a history of only offering public benefits to the poor either deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.
Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.
We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during the same era that we also created significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.
Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.
The country is due for a reckoning with its obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we only concerned with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? A reckoning doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.
* * *
Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.
Editor: Michelle Weber
Fact checker: Ethan Chiel
Copy editor: Jacob Z. Gross
Dancing Backup: Puerto Ricans in the American Muchedumbre

Carina del Valle Schorske | Longreads | April 2019 | 28 minutes (7,237 words)
Muchedumbre.
Noun, feminine: An abundance of persons or things; crowd, horde
Noun, biblical: Survivors, the chosen
* * *
When I fell for the video girl in Omarion’s “Touch,” I never thought I’d come to know her name. I loved her for her low-slung baggy jeans and spangled bustier. I loved her for the wave arranged across her forehead, her sly smile, and most of all, of course, for the way she moved. In the video, Omarion spots her with her girls as she’s leaving the club, and soon they involve each other in a pedestrian duet that elaborates the walk home along the lines of a Cuban rumba: frankly sexual, magnetically relational, but rarely, barely touching.
What won my attention was an unusual liberty in her movement — unconfined, it seemed, by a tightly choreographed routine or proper place in the staged urban environment — and a looseness in her waistline I can’t help calling Spanish. In Latin music, lyrics linger less over hips and ass, lavishing attention on la cintura atómica, la cintura sueltecita as the locus of sensual movement, maybe even the primary engine of Latin culture’s successive “explosions.” Marking the waist as specifically Spanish doesn’t check out in a diasporic vocabulary that includes wining, belly dance, even hula. But that’s how I responded to her body — with recognition. I followed the current that ran up and down her torso, briefly electrifying each gesture as if it were a spoken phrase that would resolve into a statement. I wanted to know where the meaning would land.
I didn’t expect to see this dancer again. Maybe I couldn’t see past the way she’d been cast: as a girl who appears, suddenly, in the chaos of the club, then slips back — a moment, an hour, a day later — into the city’s unsyncopated working rhythm. Blink. Touch. This was 2005, before the internet’s full power was at my fingertips, before I could feel confident that “Omarion video girl” would yield a name, a résumé, a world. I didn’t try. For years I’d return to her on YouTube, exhibiting her to friends and lovers, an avatar of erotic freedom, improvisational play, anonymous genius. I wanted her to be noticed beyond the terms the screen had set. And I wanted to be noticed for noticing her.
* * *
Pop culture teaches us that backup dancers are beneath notice. They’re not real artists, and the pleasure we take in them is primitive. They are not suitable emissaries of culture, even if culture wouldn’t be any fun without them. There are no prominent prizes for video girls, no credit roll at the end of the concert naming names. When we pick favorites and mimic their moves, our mothers make sure we know not to aspire. Backup dancing is not aspirational; it’s a no-man’s-land where brown girls are liable to languish, underpaid and overworked. It’s one wrong turn away from sex work. When Cardi B brags, “I don’t dance now / I make money moves,” she’s minimizing the difference between the kind of dancing she used to do on the pole and the kind of dancing done on other stages. Neither one, she seems to say, will pay. These messages have posed a problem for me, because I grew up in a time and place in which every Puerto Rican you’d ever heard of was — or had been — a backup dancer.
The distinction between was and had been didn’t matter that much, because the fact that certain individuals had achieved star status did little to reduce the stigma of salacious amateurism that lingered with them. Especially before Lin-Manuel Miranda, Sonia Sotomayor, and Alexandria Ocasio-Cortez went to Washington, the prototypical Puerto Rican in U.S. consciousness was [Dancing Girl emoji, skin tone tan]. Probably, she still is. Even the nation’s youngest congresswoman is haunted — or rather, refuses to be haunted — by her younger body, bopping across the rooftops of Boston University in 2010. As a dweeby tween, I wasn’t ashamed: I liked being noticed in relation to something “sexy.” But I see now why my mother was. There’s an implied analogy between the backup dancer and Puerto Rico itself, as if the island exists first and foremost for the empire’s entertainment, as if Puerto Ricans can be famous, too, so long as we know our precarious, paradoxical place.
Official policy refers to Puerto Rico as a commonwealth, but it’s really a shadow colony in plain view, hypervisible especially in relation to the colonies most Americans don’t know or name: Guam, American Samoa, the U.S. Virgin Islands. The United States government sometimes refers to Puerto Rico as “the shining star of the Caribbean,” a phrase dreamed up for a midcentury publicity campaign designed to attract business investment to the island. But this special status has not protected Puerto Rico — or its diaspora — from myriad forms of colonial extraction. Puerto Rico is both empire’s “shining star” and, in the notorious words of U.S. Senator William B. Bate, “a heterogeneous mass of mongrels,” threatening the nation’s delicate racial and political ecosystem from the shadowy margins. There are too many of us (“mass”), and each one of us already contains too many (“mongrel”). When changes in U.S. economic priorities have displaced Puerto Ricans from Puerto Rico itself, we’ve become backup bodies in cities like New York, Chicago, and Philadelphia. By the late 20th century, Puerto Ricans made up the largest “immigrant” group in New York City. Life hasn’t been much better stateside, but there is still an important sense in which the Puerto Rican pseudo-citizen moves dique freely in relation to her cousins in the rest of the Caribbean and Latin America. She won’t be deported, exactly. Instead, she’ll spin in a perpetual motion machine.
All of these myths and policies converge on the body of the Puerto Rican backup dancer. The consolation prize for second-class citizenship — really, for lack of sovereignty — has been cultural nationalism. We can shimmy and shake all we like, get loud and proud about how well we do it. But even when the backup dancer gets to be a star, she’s on the blink, appearing and disappearing like the bright spot on the nocturnal satellite map before and after Hurricane Maria.
Over the years there are certain stars I’ve come to count on, who seem to have achieved a steady glow: Rita Moreno, for example. Rosie Perez. Jennifer Lopez. Invoking them in sequence, like this, suggests a progressive history, a lineage in which I secretly attempt to situate myself. But the more I read into it, the less it feels like history and the more it feels like a cut-rate carousel. I’m stuck on the constant costume changes these women have hustled through to appear, against the backup dancer’s odds, as names we know. Despite the individuality that stardom confers, they’ve passed through many of the same institutions and come to many of the same professional crossroads. Sometimes they have literally danced in each other’s footsteps or played the same roles. They stand out from and stand in for New York City itself — Nueva York, los niuyores — a few recognizable forms in what the performance scholar Jayna Brown calls “the multijointed body of the female tableaux.” She’s talking about black vaudevillians at the turn of the 20th century, but the image translates: there’s a complex pleasure to getting lost in the crowd. Brown goes on to quote a contemporary of Josephine Baker’s: “She was just a chorus girl, baby, we all was chorus girls.” But it’s hard to hear her tone. Is the chorus girl jaded, disabusing us of the glamour we associate with the star, implying that she can never really rise above her station? Or is taking the star down a peg a way to hold her close, to include her in movement’s “we,” movement’s “all”?
* * *
Growing up, I wanted to be included — even, especially, in the mass of mongrels. I knew Senator Bate didn’t mean to make it seem like so much fun, at least not on the face of it. But by the time we get around to the 1978 Rolling Stones song “Miss You,” Mick Jagger is sure the way to sound American on R & B radio — the way to sound black — is to growl “we’re gonna come around at twelve / with some Puerto Rican girls / that’s just dying to meet you.” I liked singing along — accustomed, like women of all backgrounds, to extracting pleasure and power from pop music’s misogyny. Sometimes I still do.
Maybe I was particularly vulnerable to crude seductions because our family was the opposite of a crowd: me and my mother in California, my grandmother in New York, no siblings, no husbands. Until I left the Bay Area for New York when I was 18, my direct relatives were the only Puerto Ricans I really knew. I was grateful for my Chicanx friends at the private schools we attended on scholarship — we began our political lives together — but culturally speaking they didn’t really know where to place me, and I wasn’t in a position to help them. If Jennifer Lopez implied an urban world teeming with around-the-way girls and spontaneous block parties, I was eager to be implicated.
In Zami, Audre Lorde’s erotic memoir, she articulates her mother’s longing for her natal island of Grenada: “She missed the music you didn’t have to listen to because it was always around.” When my mother danced around the apartment it became populous — with stories of her father’s famous footwork, Motown madness with her college boyfriend, José, the live drums from the New Rican village that seemed to fall in line behind her heels. We’d angle out the closet door with the full-length mirror so she could teach me her teenage moves: the Mashed Potatoes, the Watusi, the Jerk. And then she’d spin out where I couldn’t follow, spurred into a frenzy by the telltale cowbell in “Adoración.” She was multiplied at both ends: by everything that entered her and everything her dancing made me do, the movement she started in the living room. A culture of one. Given our isolation, it would take me years of living in New York to discern which of my mother’s gestures and behaviors were the product of her powerful personality, and which were Puerto Rican cultural commonplaces. It isn’t always easy, or explanatory, to name the difference.
In her self-titled memoir, published in 2011, Rita Moreno remembers moving to Washington Heights and “sitting on the wrought iron grille base beside an open window … while our new radio, shaped like a small cathedral, blared music to me and to any other appreciative Latinos within earshot.” With neighbor girls she “put on costumes and spun through living rooms [and] even ‘entertained’ on the rooftop.” Rosie Perez credits her early dance training to the long summers she spent with her cousin Cookie “in a dilapidated tenement that she kept clean as hell … doing the Hustle in the kitchen while my wet set dried.” I wonder if we’d call it training if we never came to see her dance on TV. Was I training, too, for the pedestrian life I have, in which I’m only famous for my dancing among the friends who follow my Instagram stories? For my gracelessly improvised life as a writer?
The New York I live in now is more densely Caribbean than it was when Audre Lorde’s mother suffered the unmusical noise of the north. Despite the city’s constant war on public space, the air at least stays thick, stays wavy. These days the uptown bodegas play bachata, and when I walk by I like to let it inflect the rhythm of my walking — the music I don’t have to listen to because it’s everywhere, the dance I don’t have to do because it’s always in my body. It’s a trope of black diasporic dance to start small, as if walking, as if merely shifting weight, hitching a skirt — the better to dramatize the smooth continuum between everyday life and the high fever of the mess around.
My mother sometimes worries about the way I walk, especially in Washington Heights, where my grandmother lives. She migrated — pregnant with my mother — 15 years after Rita Moreno, in what historian Lorrin Thomas describes as “the postwar boom … that nearly doubled New York City’s Puerto Rican population in two years.” We’ve come to call it “la gran migración,” taking a cue — as we often do — from African American history’s Great Migration from the rural South to the urban North. I still visit my grandmother in the same neighborhood — the same building — where my mother grew up.
And yet it isn’t the same. I was born post-crack and post-Reagan, so our block has always been that kind of hood to me. Now it’s gentrifying. I admit wishing we could keep the ancestral apartment, somehow, so I could live there with rent control. But she doesn’t think I understand the danger. Around here, Latinas are always the ones hit hardest by street violence, she says. I don’t know whether I am, in this case, her daughter or the daughter of my gringo father. So I ask. She thinks the corner boys can tell I’m Latin like them: You can’t do anything about the way you move. In the heat of conflict I feel a pleasurable frisson: the transmission alive in me. I wouldn’t wish that way out of my body, because I wouldn’t wish my body away. It feels safer, somehow, to stay close to my mother even when she says it isn’t.
I know that standing out can pose its own dangers, depending on how and among whom. Cue Zora Neale Hurston: I feel most colored when I am thrown against a stark white background. The image evokes the police precinct’s mugshot as vividly as the museum’s gallery wall. I also know that being singular — or at least, the idea of being singular — has mattered to both my grandmother and my mother because it’s mattered to their survival. Moving — out, away, up from poverty — is often easier alone, dissociated from the trope of the hungry horde. But even loneliness has a lineage, and I find myself feeling for it.
* * *
Rosita Dolores Alverío was not technically an only child; her mother had left behind her younger child, a boy, when they migrated from Juncos, Puerto Rico, in 1936. But in the wake of this desperate choice, Rosita was raised like one, with the intensity of attention I recognize from my mother’s only childhood and my own. Focusing on one child mitigates the economic limitations of working-class life — and of course, raises the stakes for a return on investment. Even by the impossible standards of an immigrant mother, it’s safe to say that Rosita made good as Rita Moreno, the first Puerto Rican to become a bona fide star in the United States. She’s won all four major prizes in American entertainment — the Oscar, the Grammy, the Emmy, and the Tony — and her 1962 Oscar for Best Supporting Actress as Anita in the musical West Side Story remains the only Oscar ever awarded to a Latina performer.
Over time, this distinction has become a bitter sign of how tightly U.S. culture seeks to control our conditions of appearance. But in her memoir, Rita conveys the animating thrill of matriarchal ambition that first set her spinning onstage as a child dancer. In certain moments, her descriptions of their shared labor sound almost utopic:
A happy home has its own music. The house hummed with Mami’s Singer sewing machine as she worked the foot treadle. This machine was so old; it was not an electric model. All the energy came from Mami, from her foot tapping and rising and falling. It sounded like the roll of a Spanish rrrrr! As if in accompaniment, I danced in time with its pulsing, while Mami was creating headdresses and costumes for me.
I didn’t demonstrate enough talent in ballet class to warrant such a scene, but my mother did make our home into a kind of studio, ready for whatever talent might emerge for cultivation. In the “happy” immigrant home, work and play are closely intertwined by necessity. Work must become play, or play must become work, if play is to survive as a vital practice. Like my grandmother, her sisters, and the majority of Puerto Rican women immigrants to New York City, Rita’s mother first worked as a factory seamstress. At home, she turned these same skills to the fanciful project of imagining new and dramatic ways for her daughter to appear. Rita was the chosen channel for this form of dreaming, but the dream itself was more general: to produce, with the means of production at hand, a range of possible lives and the freedom to move among them.
When the doors of Hollywood opened for Rita Moreno, they didn’t open for all her possibilities. They opened for a Slave Girl, an Indian Princess, a Dusky Maiden. It was one role, really: the temporary romantic interest of the white leading man led astray by her temptations before settling down with a suitable (read: white) wife. Who can blame Rita Moreno, then, for her profound ambivalence about so-called stardom? “Cold feet” kept her from auditioning for the principal role of Maria when West Side Story was on Broadway, and her anxiety persisted even after she secured the supporting role of Anita in the film adaptation. Though Anita animated contemporary anxieties about New York’s “Puerto Rican problem,” the role was also substantial, a rare opportunity she was sure she’d somehow squander: “A shadow followed me like a backup dancer, making me worry that it would only be a matter of time before I would lose everything.”
There she is: the backup dancer, making a cameo here as a sly, flexible metaphor. If Rita’s shadow is the backup dancer, then Rita herself is surely the star. But the metaphor seems to articulate the slippage between the two positions — the backup dancer is the star’s shadow side, the constant reminder of how precarious her visibility really is. She’s on her heels, grabbing hold wherever her body touches ground. Maybe Rita felt shadowed by the roles she’d been forced to play, unable to get out from under the sense of herself as an erotic extra. Or maybe she couldn’t escape the sense that her luck would always come at someone else’s expense: she was keenly aware of replacing another Puerto Rican dancer, Chita Rivera, who’d triumphed as Anita on Broadway. She was convinced she could “never, ever be as good as Chita,” that she’d never deserve the power of her position.
But if the backup dancer haunts the star, she also keeps her company. “Rita the Cheetah,” as she was known in the press, would never be lonely as Anita: the role activated a rhyme of substitutes, a small crowd of Puerto Rican hopefuls passing in and out of the spotlight. In fact, Rita deliberately “sought out a friend who had played the part of Anita on a coast-to-coast tour,” eager to learn a few steps for her audition. Every dance begins in — as — someone else’s shadow. That’s just how it is. However singular her performance would turn out to be, Rita became Anita in relation to the other women who had been her. A gang of Anitas gave birth to Rita’s Anita, the gang leader.
Ultimately, it is Anita, with her active — if contentious — relationship to group identity, who is West Side Story’s brightest star. It is Anita, not Maria, who seems to summon the whole urban world into being with a swirl of her purple skirts and a clap of her hands: “Here,” said the New York Times review, “are the muscle and rhythm that bespeak a collective energy.” When I imagine a world ruled by Anitas, I get a festive feeling, as if I’m climbing the fire escape to the famous rooftop scene. I can almost smell the summer-softened tar, the beer going flat, the perfumed sweat rising as banter becomes music, becomes, suddenly, a dance battle. Maybe there’s a way to wiggle free from our collective confinement without leaving each other behind. Maybe there’s a way to argue over what “America” has made of us in our own language.
From the rooftop, these dreams don’t seem so far off. But in her memoir, Rita Moreno asks us to stay with her in closer quarters, to find freedom in a scene where her only company is her own shadow, in a moment that’s not right for shimmying. In one of West Side Story’s most tragic turns, Anita leaves Sharks turf to deliver Maria’s message to Tony, only to be intercepted by the Jets:
When I had to play the attack scene in the candy store, I wept and broke down — right on set. It was that incredible, amazing, magical thing that happens sometimes when you’re acting and you have the opportunity to play a part so close to your heart: You pass through the membrane separating your stage self from your real self. For a time, at least, you are one person.
The “attack scene” has always been understood as an implied gang rape, which heightens the intensity of her language in this passage: why should inhabiting a scene of traumatic violence be “incredible, amazing, magical,” a restorative moment of contact with her “real self”? Trauma is usually narrated using exactly the opposite vocabulary: splitting, sundering, shattering. But for Rita Moreno, to break down is to return to a truth about her experience in the industry that her usual performance of resilience obscures: being singled out for special treatment by Hollywood’s power players had a shadow side.
Rita’s first sexual experience was what she later came to recognize as rape by a man who claimed to want to work as her agent. Immediately after the filming of West Side Story, her long-running, emotionally abusive affair with Marlon Brando would drive her to attempt suicide. Of course, these biographical details do not exactly correspond to the violation implied by the candy shop scene. Rita was never a Puerto Rican gangbanger; her working-class Washington Heights was more like my mother’s than Anita’s. And yet, the projection of these fantasies onto her body — the stereotype of her body as essentially available, disposable, and replaceable — put her in the way of real violence, mostly at the hands of white men. Becoming a star required a dangerous risk: leaving her own turf for the way her turf was rendered in show business. The candy shop wasn’t real to Rita, but the candy shop scene did feel real, with its crowd of white men curtailing her movement with threats and demands. This time, she did not have to hide her fear and anger for the sake of her career; she could dance with them.
There’s a moment in Peter Pan when Peter’s shadow runs away and Wendy intervenes to carefully stitch it to the soles of his feet: a woman’s work. I think of Rita in West Side Story as her own Wendy, mending her relationship with the shadow that would follow her everywhere in the Neverland of American show business. It’s another kind of costura, more painstaking, maybe, than the dreamwork that produced her first costumes. Here, her desire to be “one person” is not the same as a desire to escape alone, to escape intact. Instead, it reflects the difficult knowledge that she is one person only when she can bear to incorporate the parts of herself she’s disavowed.
* * *
In an interview from 1998, Jennifer Lopez refers to Rita Moreno as “the original Fly Girl,” naming her the inadvertent matriarch of the Fly Girls featured on Keenen Wayans’s hip-hop-driven variety show In Living Color, where Jennifer got her first big break. She shifts the focus from Rita’s moment of semi-stardom as Anita to imagine her in relation to a small collective of dancers, most of whom did not move on to fame and fortune. It’s a complicated gesture, elevating the Fly Girls by saying they have a history while at the same time pluralizing Rita’s individual achievement. She was just a chorus girl, baby. We all was chorus girls. Every genealogy of Puerto Rican performers — including the one I’m moving through in this essay — will be intimate, idiosyncratic, and provisional. But if we’re talking about the Fly Girls, specifically, it’s fair to feel like someone’s missing.
In large part because of the narrative of competition forced upon them as two Puerto Rican stars in generational proximity, Jennifer Lopez has never been very good at publicly acknowledging her debt to Rosie Perez, the In Living Color choreographer who lobbied to make her a Fly Girl in the first place. I think a lot of Latinas who came up with and through hip-hop are just beginning to see what Rosie meant to us — to mend, like Rita with her shadow, the disavowal that has often accompanied our admiration. DJ Laylo, a Bronx Dominicana, put it this way in an interview with Remezcla: “It’s a little bit of a sore spot for me because whenever I’m in predominantly white spaces, I always have people coming up to me saying, ‘Oh my god you sound like Rosie Perez.’ And I know they don’t mean it because they’re paying tribute to all that she is.”
My mother was the first one to introduce me to Rosie — we checked out Do the Right Thing from the library on VHS — but she, too, was plainly unsettled by Rosie’s accent, which she insisted had been exaggerated to make her seem Extra Rican. The theory wasn’t far-fetched; Rita was made to invent an accent she didn’t have for West Side Story. But I wasn’t really listening to my mother’s critiques. I was too mesmerized by the film’s famous opening credits — red lights, then blue — which find Rosie pumping her chest and throwing hooks in front of Brooklyn brownstones to all four minutes of Public Enemy’s “Fight the Power.” Whatever she was fighting I felt like I was fighting too, including my own resistance to her performance. Recently I’ve been asking friends how they remember feeling about the scene back in the day. The word “unapologetic” keeps coming up, which makes me wonder what — and who — we’ve grown accustomed to apologizing for. My friend Christina’s take is a little more specific: “She seemed like she wasn’t afraid of men.”
In some ways, history supports Christina’s formative impression. In several interviews, Rosie recounts how she first met Spike Lee at the L.A. nightclub Funky Reggae, where he was hosting a big booty contest to promote School Daze. Rosie wasn’t having it; she’d come to the club to dance: “disgusted…I jumped on the stage — okay, so it was a speaker — and bent over shaking my ass.” It’s a parable of her performance philosophy: the speaker becomes the stage as she insists upon her objection to performance as part of the performance itself. When Spike’s bouncers came through to pull her skinny butt back down, the young director decided he liked that trash-talking Brooklyn Rican. He picked her out from the lineup and gave her an on-screen solo.
It would be a merciless eight-hour shoot that gave Rosie swollen knees and tennis elbow: he solicited the anger she’d once directed at him and worked it to the bone. It’s not an endorsement of his abusive techniques as a director to say that in the final cut her anger seems to exceed its conscription to become the sign and symbol of the borough’s unrest. In a movie that centers on the political struggles between black and white men in the world of work, that cannot imagine a role for anyone else in the battle for representation in the face of racist violence, it is a Puerto Rican woman’s persistent and plotless physical practice that frames the narrative. Who or what is her adversary as she trains for a fight we never see go down onscreen? We can’t call it. The block, the pizza parlor, the movie set itself — the site of struggle is always changing. Rosie is slick with the sweat of staying ready wherever it finds her.
Part of the reason I find myself saying “Rosie” instead of her character’s name, “Tina,” is because the scene unfolds in a liminal space between our world as spectators and the world of the film, where the story has yet to be told. When Do the Right Thing first came out, the conservative critic Stanley Crouch complained in the Village Voice that the scene was “amateurish,” nothing more than a music video. He’s wrong to complain, but right to see it like that. Rosie isn’t really Tina yet, she’s Rosie, recognizable if you know her from Soul Train, and just a Puerto Rican girl dancing if you don’t. Soul Train’s practice of using amateurs to bring the energy of the street to the screen was being developed in new directions by MTV, and Spike Lee was making major contributions to the same culture. He wasn’t the first one to cast Rosie Perez from the club floor; her “realness” had become a hot commodity in the emerging hip-hop economy. Of course, someone like Stanley Crouch was never gonna get Rosie. But his critique magnifies an anxiety about her performance shared by those who thought they did.
Soul Train’s director, Don Cornelius, liked Rosie so much that he had her dance down the line twice on her first night on set. She was out of place — a Puerto Rican in Los Angeles — which made her stand out, triggering a double take. Her light skin and tight little body gave her immediate mainstream market value. But the way she moved and spoke from within that body also seemed to threaten the investment. “Is that your real accent?” Don Cornelius asked the first time he heard her speak, turning an invisible dial down. In her 2015 memoir, Handbook for an Unpredictable Life, Rosie remembers: “Don Cornelius did not want to see how I really danced,” any more than he wanted to hear how she really spoke.
On Soul Train Rosie was always trying to do the moves she’d learned back in the city: the Pee Wee Herman, the Roger Rabbit. At New York clubs like the Roxy and the Latin Quarter she had her eye on the male dancers “behind Whodini and Big Daddy Kane … all doing James Brown, Bill ‘Bojangles’ Robinson, and the fabulous Nicholas Brothers moves, making them their own.” Don’s early objections to Rosie’s dancing took the form of gender management: “Nononono, you’re a girl!” Of course, the (imagined) friction between her conventional femme sexiness and her hip-hop intensity is what gave her performances heat. If her body was disciplined in a satin miniskirt, stockings, and a waist-cinching belt, her face was not: that self-possessed sneer. Louie Carr — “Cutty Mack” — remembers Rosie as “aggressive and sexy and a little street, like a machine gun.” Don Cornelius wanted the rhythm of the weapon without the war.
Don’s struggle for control over Rosie — and here, he’s only an example — reveals the risk inherent in the aesthetics of realness. A musical like West Side Story was exciting, in its time, because it suggested an intimate relationship between the singing and dancing on-screen and the changing demographics of the city itself. Rita Moreno, the only actual factual Puerto Rican with a speaking role, was the linchpin of that seductive suggestion. In the plot, her dancing always starts a debate, a competition, a party. It always demands a reply. The delight we take in her call-and-response virtuosity implicates us in the project of imagining an urban world we can all inhabit. But the industry only let the provocation of Rita Moreno’s performance go so far. It didn’t matter that she mastered the choreography. That she waited her turn for dignified, complicated starring roles that never came. That she wore a white pleated skirt to the March on Washington. The game had rules for a reason: to make sure it never got really real.
But by the time Rosie Perez was born, whatever remained of the American Dream for Puerto Ricans was dead, and she was too black and too busy trying to survive an abusive childhood to play along. Rosie’s New York was post-Civil Rights: the War on Drugs had replaced the War on Poverty, and the collective trauma of ghetto life had already yielded several generations of black-brown collaborations including bugalú, salsa, and the beginnings of hip-hop. White institutions were no longer the only gatekeepers crafting and legislating the representation of urban culture. Rosie’s class position and her historical position intersected to make it clear that she wouldn’t, couldn’t, and shouldn’t have to assimilate out of the world that made her.
Don Cornelius, with Soul Train, was a major player in that transformation. Starting in 1971, he opened the door to the creative power of regular-degular city kids, who brought their own bell-bottoms and hustles to set, collectively forming the living, breathing backdrop for some of the most iconic black performances of the ’70s and ’80s. But on Soul Train the backdrop was the real show — not the celebrity guests who mostly lip-synched anyway. The young dancers pulsed behind the permeable membrane of the screen. And on the other side the rest of us joined the party, turning the TV into a magic mirror. A girl who could be your half sister is doing the dance you do in the front yard on Sundays, and she’s making it famous. Next time, it could be your actual half sister. Next time, it could be you. In providing a major cultural platform to kids who rarely received the message come as you are, Don Cornelius modeled the possibility of an equivalent political platform.
But he also exploited the Soul Train dancers. Rosie remembers: “We didn’t get paid, just a Kentucky Fried Chicken two-piece lunch box — not kidding.” The prestige economy forced the dancers into a frenzy of competition, like “piranhas at feeding time.” Don Cornelius — and the other impresarios who followed in his footsteps — wanted to let in the feel of freedom, but carefully calibrated to align with market protocols and the agenda of their own enrichment. That’s life under racial capitalism, beibi. If he let Rosie move however she wanted to move, she might roll up the next night with her entire hip-hop block demanding a living wage. On the other hand, if he didn’t, she might leave. One night, that’s what she did:
I walked back to the head of the line, paused, then strutted down as if I were Naomi Campbell on the runway, continued walking past Don to my seat, grabbed my things, and told him I was out.
It takes a special kind of grace to perform and stop performing in the same seamless gesture. The Soul Train line always pointed beyond the station; Rosie’s secret weapon has been her willingness to leave. In a 2017 interview with Desus and Mero, Rosie states it plainly: “I didn’t wanna be [in show business], so I wasn’t afraid of not getting a job. I was like, fuck this shit, I’m smart, so fuck y’all.” Almost nothing is more threatening to the star system than divestment from it. The star system often functions as an imperial structure of containment, a way to manage the unruly energy of a muchedumbre whose festivities incubate a revolutionary impulse. The Puerto Rican poet Luis Palés Matos warned everybody back in 1937: si … te picara un tambor de danza o guerra / su terrible ponzoña / correrá siempre por tus venas. Translation: if … you’re pricked by the drum of dance or war / that terrible poison / will run forever through your veins. This kind of inheritance doesn’t care who your mother is. This kind of inheritance could go viral.
* * *
Over time I find myself feeling disappointed in Jennifer Lopez, and this might be the moment to ask myself why. It’s a refrain among Puerto Rican women I know to say girls like that are a dime a dozen in my neighborhood. My mother says it, too — that her cousin Carmencita was more beautiful, with her heavy winged eyeliner and languorous way with a pencil skirt. Eyes like black coffee trembling in a cup. I’m not sure if we say so because we’re ashamed that she’s regular — the wrong one to represent our culture’s repressed powers — or if we’re ashamed that we’re regular, too, but without the will to say so what? Jennifer Lopez never claimed to be the most talented girl in the room. In her infamous 1998 interview with Movieline, she said, “I’m not the best … that ever lived, but I know I’m pretty good.” Being humble, for her, has never required being hidden — as we so often assume it must.
But Jennifer’s mediocrity is not the source of my disappointment. I don’t care that she can’t sing, or that she’s just okay at dancing. When I think about the fact that Keenen Wayans refused, at first, to hire her as a Fly Girl — “called her chubby and corny” — I’m grateful to Rosie for fighting for that “big-ass beautiful girl from the Bronx” with the “star smile.” I like the footage from that period, especially a little promotional clip for Janet Jackson’s “That’s the Way Love Goes” where Janet introduces her new dancers as “Jennifer, Shawn, and Nicky: three backed-up hoes!” It’s fun to watch Jennifer fire back, “Honey we’re here to wreck shop, what’s your problem?” Taken literally, the idiom suggests the end of buying and selling, the general damage “backed-up hoes” intend to do with their dancing.
If these are the moments I love best, then maybe I’m less disappointed in Jennifer Lopez than I am in the nature of stardom itself. She’s achieved what long seemed impossible for a Puerto Rican performer: race-blind roles, multimillion-dollar paychecks. But that doesn’t do anything to make me feel like part of an us. Her stardom feels far-off and joyless. When I try focusing on recent interviews with her, my eye always wanders from YouTube’s main screen to the little stack of further possibilities waiting in the wings, and I can’t resist clicking aimlessly. I’m more interested in the algorithm of associations than the record of any single personality.
That’s how I spot her: Omarion’s video girl, in a red crop top, striped shorts, and gold sneakers, dancing with Bruno Mars in the January 2018 video for “Finesse.” It’s a tribute to In Living Color, and Danielle Polanco — this time I can say her name — is the Fly Girl the camera loves best, leaning out from the fire escape with her girls to call down to Bruno and his boys, a Tony-and-Maria moment made plural for our pleasure. The family tree has many branches: later I learn that she danced backup for Jennifer Lopez, Janet Jackson, and Beyoncé, that she was the dance captain for the Broadway revival of West Side Story. She played Consuela, an even smaller role than Anita — a backup dancer’s backup dancer. Now, the core of her career is teaching boutique classes: “Heels” at Alvin Ailey Extension and Millennium, “Vogue Femme.” Virtuosity is not what determines a dancer’s destiny in the studio as opposed to the spotlight, and I don’t find myself wishing Danielle Polanco were a star just because I could watch her dance all day. Genius has no proper place. Insisting on the absolute distinction between genius and mediocrity drags the party down; it disrupts the circulation of genius itself.
Maybe that’s why Rosie Perez felt weird when she went to the club with her friends from Soul Train and people pointed, stared: “Look, it’s the Soul Train girls!” Just a few years earlier Rosie herself had been the random amateur scouted from the crowd. What had changed, really? The club was still her home haunt, the uncanny valley between amateurism and stardom where her career played out. It’s not hard to imagine all the other Rosies on the dancefloor who’ve remained undiscovered, but still manage to steal the show when the beat drops. Then there’s the rest of us, shoulder to shoulder, an undulating wave of body heat that breaks, now and then, into open conflagration.
* * *
Three years ago in Brooklyn a new DJ night was born, spinning salsa and reggaetón and trap en español: “A Party Called Rosie Perez.” It’s organized by Christian Martír alongside DJ Suce and DJ Laylo, the same woman who bristled when the wrong people projected a resemblance. It’s gotten hot: when my friend Cassandra went, she spotted Residente from Calle 13. The first time I go, Bobbito Garcia, the legendary hip-hop DJ, is at the turntables and I’m dancing with my friend Yohanna while a video projection of Rosie on Soul Train plays on the club wall. Now and then someone bumps the shaky projector and Rosie’s head gets cut off, so she looks like a doomed chicken flapping through her final bravura performance. I can see the bright shadow of her younger body pass over Yohanna’s, Rosie’s rapid pumping playing a polyrhythm over Yohanna’s more relaxed step and slide. Since we’re the party, are we Rosie Perez? Alive and moving inside the space her body’s made? The visual effect allows me to imagine that it’s possible to dance in someone’s footsteps without replacing her. To channel someone’s spirit without making her a ghost.
My reverie is interrupted when a young white boy dancing next to me taps on my shoulder and points to the screen, shouting who is that? In America, I remember, you can immerse yourself in Puerto Rican culture without knowing it. Without ever naming a name. Months later, I think of this moment while reading La raza cómica by the scholar Rubén Ríos Ávila, who offers some counter-questions: What is pleasure worth if it cannot be deciphered? What is the joy of dance good for if we can’t know its point of origin?
I understand the impulse behind the Party as my own: a form of feeling for history. In the absence of something so static or simple as a point of origin, a name is a portal — a way into the crowd as well as a way out of it.
When I leave the club my body’s still buzzing. For a moment I think I see Danielle Polanco, striking a pose on the subway platform. Up close, I see she’s just another cinnamon girl with a high bun and hoops whose skin is dewy from the sweat of a summer night. But I can’t help feeling we’re both backup dancers. Any sudden movement might start a number. We might already be in a number without knowing it, an elaborate social production we didn’t design, roles we didn’t choose, and for which we are probably not being properly compensated. But as backup dancers we’re always ready. Are you?
* * *
Carina de Valle Schorske is a writer and translator living between New York City and San Juan, Puerto Rico. She is currently at work on her first book, a psychogeography of Puerto Rican culture, forthcoming from Riverhead and tentatively titled NO ES NADA: Notes from the Other Island.
Editor: Danielle A. Jackson
Copy editor: Jacob Gross
Fact checker: Ethan Chiel
Taming the Great American Desert

John F. Ross | The Promise of the Grand Canyon | Viking | July 2018 | 24 minutes (6,540 words)
In April 1877, the normally staid proceedings of the National Academy of Sciences’ annual meeting in Washington took a dramatic turn. For two weeks, members had listened to the nation’s most distinguished scientists speak on topics ranging from lunar theory to the structures of organic acids. Members enjoyed “Results of Deep Sea Dredging,” by the son of the recently deceased scientist Louis Agassiz. The Academy had invited G. K. Gilbert to deliver a paper, “On the Structure of the Henry Mountains,” the range so named by Powell’s survey in honor of the Academy’s president. On the final day, the geologists took the floor, whereupon erupted a furious discussion of the American West. The rub lay between those who studied the fossils and those who examined the rock strata, each drawing wildly different conclusions about the age of their subjects.
Such was the fervor of the discussion that the geologists soon jumped to their feet in animation and anger. “[W]hat they might do if they once went fairly on the rampage, it is impossible to say,” wrote one correspondent. Hayden rose to argue that no great degree of difference existed between the two sides, but others immediately shouted him down.
Yet while the rather scholarly debates over dating and provenance might animate the geologists, that day would be remembered not for these petty theatrics, but for an address Powell delivered. In it, the Major stepped away from the fields of geology and out of academic realms to address a topic that pressed right to the heart of American democracy. During the Townsend Hearings three years earlier, he had raised the issue of the West’s extreme aridity and the difficulty of irrigating much of it — but he had thought a lot more about it since then, and the map he now unrolled in front of America’s top scientists carried startling implications. He had bisected the map of the nation from Mexico to Canada with a vertical line rising from central Texas up through Kansas, east of Nebraska, and through Minnesota, roughly approximating the 100th meridian. At this line the arid West begins with startling consistency: the tall prairie grass cedes to short grass and less fertile soils. Trees appear rarely west of the line, except at high altitudes and in the Pacific Northwest, while forests dominate the east: The 100th meridian elegantly divides two separate lands, one composed of wide horizontal vistas, so much of the other defined by its vertical prospects.
The land west of the 100th meridian, Powell announced, could not support conventional agriculture. Surprise met this bold statement, for the line clearly indicated that much of the Great Plains — including all of Colorado, Montana, Wyoming, and Idaho, plus Arizona and New Mexico — was essentially unfarmable. Here was the professor at his best: clear, authoritative, dramatic. He had everyone’s attention.
Powell had drawn an isohyet, a line connecting areas that experience equal volumes of annual rainfall. The relatively humid lands to the east of this line receive twenty or more inches of annual rainfall; the unquestionably arid lands to the west receive less than that, except for some narrow strips on the Pacific coast. The twenty-inch isohyet offered a valuable generalization — conventional agriculture simply could not work without twenty or more inches a year, unless supplemented by irrigation. Except for some lands offering timber or pasturage, the far greater part of the land west of the line was by itself essentially not farmable. Access to the transformative powers of water, not the availability of plots of land, proved the far more valuable commodity. By now, any land through which streams passed had already been acquired, and some of these owners charged those less fortunate for irrigation water. “All the good public lands fit for settlement are sold,” Powell warned. “There is not left unsold in the whole United States land which a poor man could turn into a farm enough to make one average county in Wisconsin.”
Much of what Powell reported was not exactly new, but no one had presented the data so comprehensively and convincingly — and no one so famous as the Major. Few, of course, doubted the region’s aridity. But in one powerful moment, Powell had claimed that the nation’s traditional system of land use and development — and thus America’s present push west — simply would not work. The debate that Powell provoked that late April day drew immediate and blistering response. The land agent for the Northern Pacific Railway, itself the beneficiary of a government grant of nearly four million acres, hammered back at Powell’s “grave errors.” “[P]ractical farmers, by actual occupancy and cultivation, have demonstrated that a very considerable part of this ‘arid’ region, declared by Major Powell as ‘entirely unfit for use as farming lands,’ is, in fact, unexcelled for agricultural purposes.” Others responded similarly. Powell clearly had touched a raw nerve. Over the next several years, he would have much more to say on the matter, igniting a veritable firestorm. While the other surveyors limited themselves to covering as much ground as possible, Powell now wrestled with the startling implications for the ongoing development of the West — and what that meant for the American democracy he had fought so hard to save.
***
For most of the first half of the 19th century, eastern America’s conception of the western portion of North America could be spelled out in three words: Great American Desert. The term originated with the Long Expedition of 1819, when President James Monroe directed his secretary of war to send Stephen H. Long of the U.S. Army Corps of Topographical Engineers with a small complement of soldiers and civilian scientists on a western reconnaissance. Secretary of State John Quincy Adams had just negotiated a treaty with Spain that ceded Florida to the United States and drew a border between the two countries running along the Sabine River in Texas, west along the Red and Arkansas rivers, and all the way to the Pacific. Eager to know more about the border and the new western territory, Monroe had the secretary of war direct Long to follow the Platte River up to the Rocky Mountains, then trace south and back east along the new border.
The energetic New Hampshire–born West Pointer envisioned himself the successor to Meriwether Lewis and William Clark — indeed, over the course of five expeditions, he would cover 26,000 miles and mount the first steamboat exploration up the Missouri into Louisiana Purchase territory. His name would grace the peak that Powell was first to climb. On this expedition, Long split his group in two, sending one party along the Arkansas while he and the rest headed south to chart the Red River. Long’s men, often parched and starving, battled a violent hailstorm, sometimes resorted to eating their horses, and negotiated their way past a band of Kiowa-Apaches. But the maps they carried were so atrociously inaccurate that the river they followed for weeks was not the Red at all.
***
Three years after Long’s party returned home, expedition member Edwin James published the three-volume Account of an Expedition from Pittsburgh to the Rocky Mountains. The ordeal had imbued Long with little affection for the “dreary plains” they had traversed. The Great Plains from Nebraska to Oklahoma, he found, were “wholly unfit for cultivation and of course uninhabitable by a people depending on agriculture.” He added: “The traveler who shall at any time have traversed its desolate sands, will, we think, join us in the wish that this region may forever remain the unmolested haunt of the native hunter, the bison, the jackall.” The accompanying map labeled the area a “Great Desert,” terminology that soon fully flowered into the “Great American Desert,” a colorful appellation that would stick to the ill-defined sections of the West for the next generation. Long believed that this desert wilderness served as a natural limit on American western settlement, acting as an important buffer against the Mexicans, British, and Russians, who claimed the western lands beyond. That compelling assertion resonated in the public imagination, locking into place the notion of a vast desert dominating the nation’s western midsection. “When I was a schoolboy,” wrote Colonel Richard Irving Dodge in 1877, “my map of the United States showed between the Missouri River and the Rocky Mountains a long and broad white blotch, upon which was printed in small capitals THE GREAT AMERICAN DESERT — UNEXPLORED.”
Even though some early trappers and mountain men had brought back word of a land often far from desertlike, the idea persisted. In 1844, when U.S. naval officer Charles Wilkes published his five-volume Narrative of the United States Exploring Expedition, it included a map of upper California. Inland from the well-detailed Pacific coast lay the Sierra Nevada, while the front range of the Rockies marked the map’s eastward extension. In between the ranges lay a vast, wedge-shaped blank space, without a single physical feature delineated. Unable to leave such a realm blank without remark, Wilkes had inserted a simple paragraph reading “This Plain is a waste of Sand. . . .” Like the sea monsters inhabiting the unknown sections of medieval maps, the label condemned the entire region; for Wilkes, as for Long, the dead space was not even worthy of a second look. Eleven years later, a Corps of Topographical Engineers map sought to add detail, but could only insert a tenuous dotted line indicating some cartographer’s wild guess about the Colorado River’s course.
Cracks started appearing in the notion of a Great American Desert during the early 1840s expeditions of John Charles Frémont, son-in-law of that powerful advocate of Manifest Destiny, Senator Thomas Hart Benton. With Benton’s backing, Frémont led both a four-month survey of the newly blazed Oregon Trail in 1842 and an audacious fourteen-month, 6,475-mile circuit of the West, beginning in 1843. Frémont’s subsequent reports combined a deft mix of hair-raising adventure with scientific discovery, thrilling readers with images of guide Kit Carson and the so-called Pathfinder himself running up a flag atop a vertiginous Rocky Mountain peak. The maps accompanying the reports furnished emigrants with an accurate road map for the journeys that thousands would take west in the 1840s and 1850s. Frémont’s reports indicated that the interior West certainly contained stretches of truly arid land, but that it was no unbroken Sahara. Even so, the pioneers and gold seekers understood that great opportunities lay not in this parched region but beyond, at the end of the trails, in Oregon and California. Most of the West still remained no more than a place to get across.
In the late 1850s, a rather startling shift turned the idea of the Great American Desert on its head. “These great Plains are not deserts,” wrote William Gilpin in a late 1857 edition of the National Intelligencer, “but the opposite, and are the cardinal basis of the future empire of commerce and industry now erecting itself upon the North American Continent.” Gilpin, the electric-tongued son of a wealthy Philadelphia Quaker paper merchant, would do more than any other single individual to persuade his fellow citizens that America’s great midsection was a garden only waiting to be plowed. Whereas the term Manifest Destiny had been coined as a justification for conquering great swaths of the continent at gunpoint, Gilpin transformed it into a more wholesome vision that pulled peoples across the nation. It also carried the weight of the Enlightenment’s injunction, articulated by the philosopher John Locke, that God and reason commanded humans to subdue the earth and improve it. As Civil War soldiers returned home, all America could climb aboard with Gilpin’s fantastical promises, any threatening idea of a great desert now disregarded. He had given America what it most wanted to hear: the promise that its growth was unlimited, its western lands a never-ending buffet of opportunity, limited only by a lack of imagination and courage.
Gilpin had impressive credentials: Not only had he joined Frémont and Kit Carson on their expedition to Oregon in 1843, but as an army officer he had fought the Seminoles in Florida, served as a major in the First Missouri Volunteers during the Mexican War, and marched against the Comanche to keep the Santa Fe Trail open. A columnist for the Kansas City Star observed that “his enthusiasm over the future of the West was almost without limitation.” He became a disciple of Alexander von Humboldt, the great German geographer, who published the early volumes of his Cosmos in the late 1840s, elaborating the thesis that geography, climate, and biota incontrovertibly shaped the growth of human society. Gilpin pressed the Humboldtian idea that much of North America lay within an Isothermal Zodiac, a belt some thirty degrees wide running across the Northern Hemisphere, which contained climatic conditions ideal for human civilization to blossom. Herein lay the justification for Gilpin’s remarkable, if fanciful, theory that rationalized American exceptionalism. In three letters to the National Intelligencer in the late 1850s, later developed into an influential book, Gilpin outlined how North America’s concave shape had determined its grand destiny. The Mississippi Valley drained the bowl defined by the Appalachians to the east and the Sierra Nevada and Rockies to the west. By contrast, the Alps of Europe and the Himalayas of Asia rose in the center of their continents, forming insurmountable barriers to any continental unity. The geographical realities of Europe and Asia broke them into small states, pulled them away from common centers, and forced upon them a history of unending warfare. North America, Gilpin grandly declaimed, had a national, unified personality. Thus endowed with a centripetal, unifying geography that encouraged a single language and the easy exchange of ideas, and favored the emergence of a continental power, North America stood ready to achieve world primacy.
Gilpin claimed that America would fulfill its destiny in the so-called Plateau of North America, the region between the main Rockies and the Sierra Nevada, “the most attractive, the most wonderful, and the most powerful department of their continent, of their country, and of the whole area of the globe.” Here Gilpin shone at his most incandescent, piling sheer fantasy built on pseudo-science and hope ever higher. As the war ended, most Americans had embraced the West as an untapped Eden, not as the barren edge bounding the American nation, but as the very place in which it would fulfill its national destiny.
Certainly, other forces supported such a change of heart about the West. The railroads — America’s most visible instrument of Manifest Destiny — adopted such sentiments with enthusiasm. To encourage the largely authentic nation-building efforts of the railroad companies, the federal government bestowed vast swaths of public land abutting their tracks upon these rising powers, many of which were now laying track furiously across the continent. Their long-term interests hinged on the high value of the land they penetrated. The West as garden, rather than desert, suited their ambitions far better, and railroad publicists rolled out a relentless tide of promotional material. Utah was a promised land, proclaimed the Rio Grande and Western Railroad. “You can lay track through the Garden of Eden,” said Great Northern Railroad’s founder J. J. Hill, “[b]ut why bother if the only inhabitants are Adam and Eve?”
A new, supposedly scientific idea arose to support the vision of productive dryland farming. The “rain follows the plow” theory became the gospel of the westward movement. Simply cultivating the arid soil, this theory postulated, would bring about permanent changes in the local climate, making it more humid and thus favorable to crops. The climatologist Cyrus Thomas, who had founded the Illinois Natural History Society that had given Powell his chance, became one of the theory’s strongest advocates. “Since the territory [of Colorado] has begun to be settled, towns and cities built up, farms cultivated, mines opened, and road made and travelled, there has been a gradual increase in moisture . . . ,” he wrote. “I therefore give it as my firm conviction that this increase is of a permanent nature.” Hayden, along with many other national figures, endorsed this intoxicating but deeply flawed theory.
In 1846, Gilpin addressed the U.S. Senate, asserting that “progress is God” and that the “destiny of the American people is to subdue the continent — to rush over this vast field to the Pacific Ocean . . . to change darkness into light and confirm the destiny of the human race. . . . Divine task! Immortal mission!” Even at a time lit up by fiery eloquence, Gilpin stood out, his giddy pronouncements seismic in their appeal, emotionally resonant, wrapped in morality, and nationalistic in self-praise. Few could resist so powerful an appeal. And few did.
Gilpin and Powell had met at least once, in Denver City, on the Major’s first trip west in 1867. The ex-governor had probably waxed enthusiastic about the great promise of the West, perhaps even suggesting that the Colorado River lay open to exploration. No record exists of their conversation, but Powell did not seek out his help or opinions after that. The Major found himself more comfortable with William Byers’s gritty practicality.
Indeed, Powell had no truck with the “rain follows the plow” theory. He believed that the Southwest was indeed a desert, one that could be cultivated, but only with the careful marshaling of the limited resource of water. Powell’s urgings of caution elicited widespread groans and charges that he was backward-looking. That summer, he quietly ordered his senior investigators west to gather data on irrigation practices. Ostensibly traveling to northern Utah to classify land, Gilbert would examine Mormon water-delivery technology in the Great Salt Lake drainage area. Dutton would continue his geologic studies on the Colorado Plateau, but take some time off to survey irrigable lands in the Sevier River Valley and measure the river’s flow.
***
On March 8, 1878, Representative John Atkins of Tennessee, chair of the House Appropriations Committee, introduced a resolution that called for the secretary of the interior to submit a report summarizing the operations, expenses, and overlaps of the work conducted by geological and geographical surveys over the past ten years. During the consequent hearings, Wheeler, Hayden, and Powell testified about their surveys.
Powell’s young secretary would recall how Wheeler appeared dignified but aloof in his testimony. Hayden came on like a freight train, bitter and long-winded. He immodestly championed his work above that of the others and claimed that no duplication among the surveys had occurred. Once Hayden had finally finished his statement, the exhausted committee turned to Powell. In silence, the room of congressmen and a large assembled audience waited as Powell paced back and forth in the chamber, his stump clasped behind his back. All expected an impassioned speech denouncing Hayden’s claims one by one. But Powell ignored the earlier testimony. He gave a calm, even-keeled appraisal of his own work, applauded the achievements of the others, and then contended that much overlap between the surveys had occurred. Soon the entire committee was following his every word. “It was plain to see,” noted his assistant, “that the day was won.”
But even the ascendancy he gained at the congressional hearings did not satisfy Powell. Never one to sit back, he prepared to make the riskiest, most brazen gamble of his career — one eclipsing even the decision to run the Colorado. One of his greatest intrinsic strengths lay in realizing that opportunity so often arises out of good timing. The timing now — with the push for survey consolidation in full swing and congressional discussion bubbling away — offered an optimal chance to take hold of the narrative and change its course. The report he would release was nothing less than explosive. He would reach far beyond his own survey work, indeed push so far beyond the bounds of a federal bureaucrat as to astound observers, seeming to shoulder the whole American experiment and bear it westward.
While Hayden and Wheeler conducted their fieldwork during the summer of 1877, Powell had stayed home, working assiduously on a document that built on the ideas he had presented to the National Academy of Sciences the year before. His Report on the Lands of the Arid Region of the United States, delivered to Interior Secretary Schurz on April 1, 1878, would be monumental and astonishing, and, in the words of a respected mid-twentieth-century historian, “[o]ne of the most remarkable books ever written by an American.” Starting with Charles A. Schott’s meteorological observations, buttressed by Gilbert’s and Dutton’s ground measurements of water requirements necessary for irrigation, Powell presented a formal, prescriptive plan for developing the West. In this report he integrated a lifetime of thought and observation, ranging from his childhood experiences in the Wisconsin grain fields to his close study of Mormon irrigation techniques, and informed by the network of ancient Pueblo canals and customs of Mexican water sharing. The thousands of miles he had walked, ridden, and climbed in the West keenly but invisibly shaped the document. At its core lay the realization battered into him on his first journey down the Colorado about humanity’s impermanence in the face of geologic time and how the Earth remained in a continual state of flux. It was more manifesto than scientific report, many of its conclusions based on incomplete evidence, much of the data hardly better than educated guesses.
Yet the conclusions have since proved ecologically sound and indeed remarkably spot-on. The report opened with a lengthy appraisal of the topography of the American West, including estimates of the amount of potentially irrigable land, timberland, and pasturage, before launching into a full-frontal assault on the current land-grant system, still rooted in the 1862 Homestead Act’s stipulation that any American adult could receive 160 acres, contingent upon demonstrating an ability to live on the land and improve it. While that system might work well in Wisconsin or Illinois, Powell argued, the arid West could not successfully support 160-acre homesteading. Those westgoers flocking into the arid lands beyond the 100th meridian would see their dreams dashed by spindly crops. Powell had directly contradicted Gilpin’s soaring promises. America could not have everything it wanted.
Powell’s recommendations focused first on classifying lands, then directing their use accordingly: Low-lying lands near water west of the 100th meridian should be available in 80-acre lots, while water-limited areas should be parceled into 2,560-acre units for pasturage. High mountain tracts bearing abundant timber should be made available to lumbermen.
He did not deny that drylands could be redeemed, but the limiting factor, as he had noted before, was water. Irrigation could “perennially yield bountiful crops,” but the West contained few small streams that could be diverted by canal to fields, and those available were already being exploited to the limit in Utah and Arizona. Such large rivers as the Colorado ran through deep chasms and hostile ground, mostly far from any potential cropland. Only “extensive and comprehensive” actions — dams and distribution systems — could deliver the water, and only those with the means to undertake the task — not individual farmers, most of them poor men — could pursue it. If not carefully planned, wrote Powell, the control of agriculture would fall into the hands of water companies owned by rich men, who would eventually use their considerable power to oppress the people. He painted a truth that still rankles many today who believe in the myth of the rugged, independent westerner. He asserted that the development of the western lands depended not so much on the individual landowner as on the intervention of the federal government, the only entity that could survey and map the land, build dams and other reclamation projects, administer vast swaths of public lands, oversee federal land grants, and tackle the displacement of the indigenous peoples. The lone cowboy taming the land with lasso and fortitude might fit the myth of the West, but the reality was quite different. Put simply, the West’s aridity required that the overall public interest trump that of the individual.
The man who had previously limited himself to describing the topographic and geologic formations of the western lands had now waded directly into populist politics, armed with isohyets and tables of rainfall-per-acre statistics. Powell believed that the very republican dream of the small farmer was at risk under the crushing power of monopolistic interests. Such resistance aligned with his core childhood beliefs: he had seen the local grain operator in Wisconsin abuse powerless farmers with impunity. The stakes, as he saw them, were of the highest order, threatening the country’s very fulfillment. With the Arid Lands report, Powell had taken on not only Hayden and his congressional supporters, Wheeler, and the army, but also the General Land Office, the railroads, and the likes of William Gilpin — an overwhelming front of entrenched beliefs, myths, and nation-building passion, the very patrimony of Manifest Destiny. He had taken a hard shot directly at virtually unchallengeable assumptions about the unlimited wealth of American resources and the bright future of the great West — and also at who would have access to whatever wealth the West had to offer.
Powell saw that arid cultures stood or fell — and mostly fell — not on their absolute amounts of water, but on how equitably their political and economic systems divided limited resources — and on how well those systems could evolve in the face of climatic and societal changes. To Powell, the Homestead Act, which imposed an arbitrary eastern standard of 160-acre parcels regardless of topography, rainfall, nearness to water, altitude, and other critical factors, appeared the height of folly: the blind, reflexive policy of a nation drunk on outsized optimism and the seemingly infinite resources available to it. Above all, he argued that the nation’s trustees needed to listen to the land itself — and respond accordingly.
Two days after Powell submitted his Arid Lands report to Schurz, the interior secretary forwarded it along to the House, which ordered 1,800 copies printed. That print run was exhausted quickly, and a second printing of 5,000 copies disappeared just as fast.
***
The Academy committee incorporated much of Powell’s report into its own, though it watered his proposals down considerably by passing over ethnology and his ideas about engineering the landscape. It recommended that the General Land Office’s surveyors general, along with the three current federal surveys of Hayden, Wheeler, and Powell, be subsumed under two civilian-run agencies in the Interior Department: all land-measurement operations would fall under the Coast and Interior Survey, while all investigations of geology and natural resources, together with land classification, would fall under a new consolidated geological survey. It also recommended that the president appoint a blue-ribbon commission to investigate public-land laws in order to create a new land-parceling system for the arid West, where traditional homesteading was both impractical and undesirable.
On November 6, 1878, the entire Academy approved the report with only one dissenting vote, that of Marsh’s bitter rival Cope. Powell focused next on the congressional backlash that the Academy’s report would surely elicit. After all, it cut out the War Department—and diminished the power of the General Land Office’s sixteen surveyors general and their contractors. And then, of course, Hayden remained capable of hijacking all Powell’s work.
Powell launched a major lobbying effort, calling upon Newberry and Clarence King in late November to help sway congressional opinion away from army management of the surveys. Ten days before the Academy presented its report to Congress on December 2, Powell decided not to seek the directorship of the new consolidated survey that Congress would most likely authorize. His deputy Clarence Dutton wrote a friend with news that his boss “renounces all claim or desire or effort to be the head of a united survey.” A close observer much later wrote that “no one episode illustrates more strongly the character of the man—to pass voluntarily to another the cup of his own filling when it was at his very lips.”
Noble sentiments may in fact have prompted Powell to step aside, but sheer fatigue with the political infighting could also have been a factor. Powell had also grown shrewd in politics, anticipating full well that, as architect of the survey and land-office reform approach, he would feel the wrath of the vested interests. A general awareness that he was seeking the directorship might put the whole endeavor at risk. He now carried great ambitions for two mighty unfolding powers—the nation and science—but not comparable ambitions for his own wealth, power, or glory. When fame came, as it had with the descent of the Colorado, he would harness it to help overcome his next challenge, not leverage it into higher speaking fees, a larger house, or political office. His distaste for self-aggrandizement embodied the Wesleyan requirement of modesty: work was done for God’s glory, not the individual’s. While Powell worshipped at a different altar, his work, not himself, remained the center of his life. But that did not mean he had stopped fighting to get someone installed who would carry on the mission of science in good form.
In his eyes, Hayden had come to stand for the culture of Grant-era corruption that flourished after the war. Hayden’s often shoddy science, Powell believed, steered the interests of the United States in a damaging direction. Hayden’s ascent to the position of senior federal scientist would doom land-grant reform. With his willingness to play up to senators and his suspect optimism about the unlimited possibilities of the West, Hayden stood flatly in the way of Powell’s struggle to open minds as to what the West actually offered. In this contest, Powell felt that nothing less than democracy lay on the line.
When Congressman James Garfield asked Powell’s opinion of Hayden’s integrity as a scientist, the Major responded blisteringly that Hayden was “a charlatan who has bought his way to fame.” He was a “wretched geologist” who “rambled aimlessly over a region big enough for an empire,” shamelessly attempting to catch the attention of “the wonder-loving populace.”
Nor had Hayden stood idly by when Congress called upon the National Academy for an opinion: “I presume some great plan will be proposed that will obliterate the present order of things,” Hayden wrote a friend, “unless all our friends take hold and help.” In another letter Hayden told Joseph Hooker that “Hon. Abram Hewitt is an enemy of mine. . . . We had a hard time this last session and came near being decapitated. . . . We had to cultivate the good will of over 300 members to counteract the vicious influence of the [Appropriations] Committee.” Hayden had lobbied members of the Academy to keep John Strong Newberry off the committee. Clarence King topped Powell’s list to run a consolidated survey.
King lived in New York, comfortable with seeking his own fortune and happily above the fray as Hayden, Wheeler, and Powell battled it out. He would do little to seek the directorship, but would be only too happy to accept it if offered. On the other side, Hayden launched a forceful letter-lobbying campaign. Unbeknownst to others, he had begun to suffer the effects of syphilis, very likely contracted from his frequenting of prostitutes. The disease, which would kill him nine years later, had already begun to cloud his judgment. His letter writing, however, appeared to be working. Again Powell countered with more lobbying of his own. In early January, Marsh received a letter from Clarence King, letting him know that King felt it was time to submit his credentials for the job.
Hayden still saw Powell as his major competitor until, in the middle of January, a friend notified him of Powell’s withdrawal; ten days later, Hayden wrote a friend that “all looks well now.” Of all the national surveyors, Hayden had published the most, received the most appropriations, and made the most friends in Congress—and indeed had the bright feather of Yellowstone in his hat. The directorship was his to lose.
In late December, Powell had finished drafting the legislation that Schurz had requested to turn the Academy’s proposals into law. Powell cleverly tied three of the four proposals to appropriations bills, clearly intending to skirt the Public Lands Committee, which was crowded with western congressmen who would never allow such issues a hearing. Schurz forwarded them to John Atkins, the chair of the House Appropriations Committee, as well as to Abram Hewitt, the committee’s most influential member. Both strongly supported the measures. Atkins waited until February 10 to open congressional discussion, whereupon several weeks of vigorous debate ensued. Powell kept at work behind the scenes as a very public debate churned over the role of the federal government in the still largely undefined areas of science. He detailed his staff to bring Garfield books from the Library of Congress so that the congressman could cogently draft his position against the changes proposed by General Humphreys and the Topographical Engineers.
The former Kansas shoe merchant, Representative Dudley C. Haskell, scoffed at federal dollars going to scientists collecting “bugs and fossils” and creating “bright and beautiful topographical maps that are to be used in the libraries of the rich.” Why would Congress reach into public coffers to pay these dubious scientists exorbitant sums to study the public lands? Other opponents of the Academy’s plan argued that the western public domain embraced much fine agricultural land. The West, the Montana newspaperman Martin Maginnis joyfully expounded, “contains in its rich valleys, in its endless rolling pastures, in its rugged mineral-seamed mountains, traversed by thousands of streams clear as crystal and cold as melting snow, all the elements of comfort, happiness, and prosperity to millions of men.” One congressman after another fumed at anyone so fainthearted as to criticize the extraordinary promise of the West. The “genius of our people,” wrote Representative John H. Baker of Indiana, was that they were “bold, independent, self-reliant, full of energy and intelligence,” a people who “do not need to rely on the arm of a paternal government to carve out their own fortunes or to develop the undiscovered wealth of the mountains.” Then he came to his real point: “I do not want them in their anxiety to perpetuate those or any other scientific surveys to interfere with our settlers upon the frontier.”
With Powell’s fingerprints all over the Academy recommendations—much of the material clearly pulled from his Arid Lands report—he now came under direct fire. Thomas Patterson, a former trial lawyer from Colorado, rose to decry Powell as a dangerous revolutionary, “this charlatan in science and intermeddler in affairs of which he has no proper conception.” Atkins’s proposal, he continued, was the work of one man, and threatened the West and its landed interests with disaster. Should Congress enlarge the land grants for grazing, then baronial estates would soon crowd the plains, an aristocratic few owning lands sufficient for a European principality and crowding out the small farmer upon whom the nation depended. Powell must have been galled when the floor debate took this particular twist, especially since he had so consciously dedicated his efforts toward supporting the interests of the small farmer and preventing the very aggregation of land and power that Patterson railed against. Patterson himself would go on to buy the Rocky Mountain News, making it a bullhorn for labor rights and the taming of corporate overreach. Indeed, the two men’s views did not diverge much. But at the heart of the matter lay a foundational debate about who should shape the development of agricultural America and how deeply the government and scientific elite should be involved.
On February 18, 1879, Representative Horace Page of California offered a compromise that accepted the consolidation of the scientific surveys but made no mention of reforming the land-survey system. Representative Haskell read a letter from a National Academy scientist claiming that the Academy’s debate had actually been far more divisive than the single dissenting vote might indicate. The congressman would not reveal the letter’s author, most probably E. D. Cope; the missive was likely a ploy by Hayden’s people to sow doubt about the Academy’s recommendations.
Atkins amended Page’s compromise to include the creation of a commission to investigate the land-grant system. The measure passed 98 to 79. The approved Sundry Bill went to the Senate, where no discussion took place. In the Appropriations Committee, Hayden’s supporters weighed in strongly, the committee amending the bill so that the scientific surveys were consolidated under Hayden, even taking $20,000 from Powell to finish up his work and giving it to Hayden. The bill then passed to conference committee. When it emerged on March 3, the last day of the session, the Senate’s emendations placing Hayden in charge had been cut out, but so had the House reformers’ bid to place all the competing agencies under the Interior Department. The last-minute collection of appropriation bills to keep the government functioning passed and the 45th Congress closed.
Hayden may well have considered this outcome a victory, the Senate indicating its interest in his running the consolidated survey. All he needed now was to take the directorship. But he had not counted on Powell. The Major did not delay, writing at length to Atkins on March 4, pinning blame on Hayden for negatively influencing the tenor of the congressional discussion by raising false issues solely to advance himself personally. Powell then revealed his deepest concern: The appointment of Hayden would effectively end efforts to reform the system of land surveys. He asked Atkins to approach Schurz and President Hayes to obstruct Hayden’s bid and to sing the praises of King.
Two days later, Powell spoke with the president, Hayes questioning him in particular on Hayden’s methods of securing appropriations. Powell also wrote a lengthy letter to Garfield, furnishing him with a withering analysis of Hayden’s published work. He did not hold back, claiming that Hayden’s mind was utterly untrained and incoherent, leading him to fritter away federal money on work “intended purely for noise and show.” Powell also worked closely with O. C. Marsh, helping to coordinate the flow of letters in support of King. Marsh traveled to Washington and also met with the president.
Cope wrote Schurz in support of Hayden, claiming that “simply shameful” personal grudges had aroused the voices against his friend. As for King, Cope insinuated that his tenure in government service had been sullied by his taking fees from mining enterprises. But Cope’s letter could not stem the tide of questions raised against Hayden. King’s nomination was officially announced on March 20. “My blood was stirred,” wrote Hayden supporter and Brown University president Ezekiel G. Robinson, upon hearing the news. “There must have been some dexterous maneuvering to have brought about a change in the President’s mind.”
The Senate approved King’s nomination with the slightest opposition on April 3. Three days later Marsh wrote Powell, “Now that the battle is won we can go back to pure Science again,” then invited him and Gilbert to present papers to the upcoming National Academy annual meeting. When Powell told King he would be pleased to work for the new United States Geological Survey, King responded exuberantly. “I am more delighted than I can express. Hamlet with Hamlet left [out] is not to my taste. I am sure you will never regret your decision and for my part, it will be one of the greatest pleasures to forward your scientific work and to advance your personal interest.”
King did not last two years on the job.
Waiting in the wings would be John Wesley Powell, who would take over the directorship of the USGS, run it for 13 years, and fundamentally shape the role of science in the federal government.
***
From The Promise of the Grand Canyon by John F. Ross, published by Viking, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2018 by John F. Ross.
Why Beyoncé Placed HBCUs at the Center of American Life

When Beyoncé strolled onto Coachella’s desert stage like a drum major on the night of April 16, no one was prepared for the spectacle that was to come. There was, of course, the sheer magnitude of it: She wore a cape and crown of painstaking detail, bedazzled by Olivier Rousteing of Balmain, referencing the ageless black regality of Nefertiti and Michael Jackson. Dozens of monochromatically clad dancers joined Bey, along with a drumline with sousaphone and trombone players. It was an ocean of sound and color against the backdrop of bleachers. “Let’s do a homecoming,” she reportedly told her choreographers in early rehearsals.
Perhaps we should’ve been ready. Beyoncé, known for rigorous stagecraft, always promises a spectacle. She’s a pop star who sings soul, although she hasn’t ever tried to be earthy or minimalist like Erykah Badu or Jill Scott, two artists whose work I can tell she pays attention to. I’m sure Beyoncé could pull off a full-length, stripped down, acoustic album if she wanted, but she’s always seemed willfully extra. Her sound is emotive, melismatic, acrobatic, and her visuals are similarly bombastic — a lot of hair, plenty of ass and sweat, and more than a few wardrobe changes.
Yet some of my favorite moments of her career are when she’s focused on fundamentals: keeping the beat on her lap while performing “Halo” at a children’s hospital, ad-libbing on Frank Ocean’s “Pink and White,” harmonizing on the relaxed, minor-note groove of Destiny’s Child deep cuts like “Get on the Bus” and “Confessions.” You notice her ear for complex harmonies, the strength of her lower register, the sense of rhythm that makes the delivery of her hooks sticky, and the staccato of her cadences — along with everything else she’s capable of, she’s also more than competent as a rapper.
What I loved most about Bey at Coachella was how her performance drew out elements that have been important in her art for the past 20 years and took them to their logical conclusion — or rather, to their true beginning. She’s long had a brassiness in her voice and she’s always mined black, Southern ways of being for her work. When her sister’s meditative album A Seat at the Table climbed the charts alongside Lemonade in 2016, both of which explicitly pulsed with a brazen black consciousness, Solange told the public not to be surprised. “I’m really proud of my sister and I’m really proud of her record and her work and I’ve always been,” she said to Fader. “As far as I’m concerned, she’s always been an activist from the beginning of her career and she’s always been very, very black.”
If you’re black and from the South, it feels like the culture of HBCUs (Historically Black Colleges and Universities) is in the ether. They are spaces you can’t ignore and wouldn’t want to. Beyoncé was born in Houston and her father graduated from Fisk University. When she was a child in the 1980s and ’90s, Spike Lee joints came out almost every other year, and Lee never let us forget that he’d gone to Morehouse, the way Morehouse men are wont to do. The culture of HBCUs and black Greek life was everywhere: Lee’s 1988 film School Daze and the 1987 TV series A Different World shared similar themes and a few principal cast members, including Jasmine Guy, who was head of the Gamma Ray sorority in the former and iconic B.A.P. Whitley Gilbert in the latter.
That Beyoncé chooses to highlight the specific culture of HBCUs and black Greek life shouldn’t really surprise us, either, and if it does, it feels to me as if we haven’t really been paying attention. A host of black artists have seen black college culture as ripe for the imaginary. At JSTOR Daily, Lavelle Porter reminds us that it was taken up by novelists Ralph Ellison and Nella Larsen at the beginning of the century, and later, by the creators of films and shows like Drumline, Stomp the Yard, and The Quad. To that list, we could add Janelle Monáe, who depicted HBCU life in her 2013 music video “Electric Lady,” as well as Kanye West, whose mother got degrees from Virginia Union and Atlanta University and was the head of the English department at Chicago State for six years.
Growing up, my older sister ran a small business selling Afrocentric gifts and black Greek paraphernalia at Classic ballgames and other events throughout the South. This was the early ’90s, when Kente cloth and Malcolm X fitted caps and medallions were everywhere. One of the T-shirts in our inventory read “The Blacker the College, the Sweeter the Knowledge,” a riff on an old saying about blackness and fecund soulfulness. At a well-attended event at Memphis’ Cook Convention Center, a customer looked me in the eyes and said she knew the future was secure since I’d been such an eloquent and competent salesperson for a fifth grader.
My sophomore year of high school, I visited a few Southern and East Coast colleges, both HBCUs and PWIs, on a tour bus with a church group. Spelman felt like home in a way that I didn’t know a place of learning could. Missy Elliott videos played in a student center; women who looked and sounded like people I loved carried full backpacks and answered our questions. When we got to Howard, we were giddy. It was a Friday afternoon in the late spring, and we spent a long time out on the green, buzzing Yard.
Part of the reason I didn’t go to an HBCU was that I was so familiar with them. Now, I wonder what I could have been had I let myself bask in that kind of affirmation for a little bit longer. Nonetheless, I was pretty sure that who I was — a nerdy, bespectacled daughter of a poor-to-working-class single mother — wouldn’t easily fit in at one of those campuses.
My experiences with wealthier black families in Memphis — and watching Bill Cosby’s shows — made it clear that I needed to aspire to a pristine, black middle-class ideal. I think Cosby’s crimes have given us an opportunity to think about the limits of some of our sacred black spaces, how the pressure to be respectable can force you to abandon or question or edit yourself if you’re poor, or queer, or anything else. By associating herself with HBCUs, Beyoncé challenges those mores with her self-avowed feminist, queer-loving and blatantly sexual art. She helps expand the possibilities of what it looks like to be a black thinking person.
That she chose to share this at Coachella, with its largely wealthy, white audience, wasn’t exactly a disruption. I truly believe that her performance placed HBCUs and black Greek culture at the center of American life, and that’s where they belong. Today, there are 102 HBCUs, a mix of private and public institutions. Most have some relationship with federal or state funding, and none have endowments like those of the oldest, private universities in the northeast, many of which are uncovering their ties to slavery. The share of black college students enrolled in HBCUs has declined in recent years, but the schools do more than their share of the work — enrolling about 9 percent of the nation’s black undergraduates and graduating about 15 percent of them.
They are also American institutions that have an important relationship with our nation’s long march towards democracy. According to W.E.B. Du Bois in his 1935 book Black Reconstruction:
The first great mass movement for public education at the expense of the state, in the South, came from Negroes. Many leaders before the war had advocated general education, but few had been listened to. Schools for indigents and paupers were supported, here and there, and more or less spasmodically. Some states had elaborate plans, but they were not carried out. Public education for all at public expense, was, in the South, a Negro idea.
Before this mass movement, the South’s leadership did not believe in the “educability of the poor,” and much of the white laboring class in the region saw no need for it, mired as they were in the plantation system’s feudalism. State by state, Reconstruction governments set up tax-based schools that would be open to all. There was resistance to nearly all of this — to the idea of blacks becoming educated, to whites teaching blacks, to black and white students sharing facilities. As a compromise, secondary schools and colleges were opened specifically to train black teachers. Fisk University opened in 1866, and Howard University was founded in 1867, partly funded by the Freedmen’s Bureau. Du Bois said these institutions “became the centers of a training in leadership and ideals for the whole Negro race, and the only fine and natural field of contact between white and black culture.”
A few studies have shown that throughout the world, compulsory education increases voter participation, and increases in education predict social engagement in the sort of groups and organizations that do critical grassroots work. The push for education on the part of emancipated blacks, then, can be considered a driving force in the ever-widening democratization of American life.
Beyoncé’s Coachella sets were a correction to the erasure and historical amnesia that make us feel like she could possibly disrupt something that her forebears had such a heavy hand in creating.
For further reading:
- What Does Bill Cosby’s Problematic Legacy Mean for Black Colleges, Lavelle Porter, JSTOR Daily
- Wake Up! It’s the 30th Anniversary of Spike Lee’s School Daze, Kelley Carter, The Undefeated
- Representing HBCUs: Spike Lee’s “School Daze” at 30, Lavelle Porter, Black Perspectives
- Under Trump, a Hard Test for Howard University, Jelani Cobb, The New Yorker
- My Father And I Both Chose HBCUs, But Not For The Same Reason, Frederick McKindra, Buzzfeed
- March to the Joyous, Raucous Beat of the Sonic Boom of the South, Richard Grant and Zack Arias, Smithsonian
- At Fisk University, A Tradition of Spirituals, Jeff Bossert, NPR
- Ebony and Ivy: Race, Slavery, and the Troubled History of America’s Universities, Craig Steven Wilder, NPR
- 272 Slaves Were Sold to Save Georgetown: What Does it Owe Their Descendants? Rachel L. Swarns, The New York Times
Seeking a Roadmap for the New American Middle Class

Livia Gershon | Longreads | March 2018 | 8 minutes (1,950 words)
Over the past few months, Starbucks, CVS, and Walmart announced higher wages and a range of other benefits like paid parental leave and stock options. Despite what the brands say in their press releases, the changes probably had little to do with the Republican corporate tax cuts, but they do reflect a broader economic prosperity, complete with a tightening labor market. In the past couple of years, real wages hit their highest levels ever, and even the lowest-paid workers started getting raises. As Matt Yglesias wrote at Vox, “for the first time in a long time, the underlying labor market is really healthy.”
But it doesn’t feel that way, does it? From the new college graduate facing an unstable contract job and mounds of debt to the 30-year-old in Detroit picking up an extra shift delivering pizzas this weekend, it just seems like we’re missing something we used to have.
In a 2016 Conference Board survey, only 50.8 percent of U.S. workers said they were satisfied with their jobs, compared with 61 percent in 1987 when the survey was first done. In fact, job satisfaction hasn’t come close to that first reading in this century. We’re also more anxious and depressed today than we’ve been since the depths of the recession, and we’re dying younger — particularly if we’re poor.
So maybe this is a good moment to stop and think about what really good economic news would look like for American workers. Imagine for a moment that everything goes right. The long, slow recovery from the Great Recession continues, rather than reversing itself and plunging us back into high unemployment. Increased automation doesn’t displace a million truck drivers but creates new, more skilled driving jobs. The retirement of the Baby Boomers reduces labor supply, driving up wages at nursing homes, call centers, and the rest of the gigantic portion of the economy where pay is low.
Would this restore dignity to work and a sense of optimism to the nation? Would it bring back the kind of pride we associate with the 1950s GM line worker?
I don’t think it would. I think it would take far more fundamental changes to win justice for American workers. But I also think it’s possible to strive for something way better than the postwar era we often remember as a Golden Age for workers.
Let’s start by dispelling the idea that postwar advances for American workers were some kind of natural inevitability that could never be replicated today. Yes, in the 1940s, the United States was in a commanding position of economic dominance over potential rivals decimated by war. And yes, companies were able to translate the manufacturing capacity and technological know-how built up through the military into astounding new bounty for consumers. But, when it comes to profitability, business has also had plenty of boom times in recent decades, with no parallel advances for workers.
This is the moment to stop and think about what really good economic news would look like for American workers.
Let’s also set aside the nostalgia about how we used to make shit in this country. Page through Working, Studs Terkel’s classic 1974 book of interviews with a broad range of workers, and factories come across as a kind of hellscape. A spot welder at a Ford plant in Chicago describes standing in one place all day, with constant noise too loud to yell over, suffering frequent burns and blood poisoning from a broken drill, at risk of being fired if he leaves the line to use the bathroom. “Repetition is such that, if you were to think about the job itself, you’d slowly go out of your mind,” he told Terkel.
The stable, routine corporate office work that also thrived in the postwar era certainly wasn’t as unpleasant as that, but there’s a whole world of cultural figures, from Willy Loman to Michael Scott, that suggest it was never an inherent font of meaning.
The fact that the Golden Age brought greater wealth, pride, and status to American workers, both blue- and white-collar, wasn’t really about the booming economy or the nature of the work. It was a result of power politics and deliberate decisions. In the 1930s and ‘40s, unionized workers, having spent decades battling for power on the job, at severe risk to life and livelihood, were a powerful force. And CEOs of massive corporations like General Motors were scared enough of radical workers, and hopeful enough about the prospects of shared prosperity, to strike some deals.
A consensus about how jobs ought to work emerged from these years. Employers would provide decent pay, health insurance, and pensions for large swaths of the country’s workers. The federal government would build a legal framework to address labor disputes and keep corporate monopolies from getting out of control. Politicians from both parties would march in the Labor Day parade every year, and workers would get their fair share of the new American prosperity.
Today, of course, the postwar consensus has broken down. Even if average workers are making more money than we used to, the gap between average and super-rich makes us feel like we’re getting nowhere. We may be able to afford iPhones and big-screen TVs, but we’ve got minimal chances of getting our kids into the elite colleges that define the narrow road to success.
And elite shows of respect for workers ring more and more hollow. Unions, having drastically declined in membership, no longer have a seat at some of the tables they used to. Politicians celebrate businesses’ creation of jobs, not workers’ accomplishment of necessary and useful labor. A lot of today’s masters of industry clearly believe that workers are an afterthought, since robots will soon be able to do anyone’s jobs except theirs.
But let’s not get too nostalgic about the Golden Age. As many readers who are not white men may be shouting at me by this point, there was another side to these mid-century ideas about work. The entire ideological framework defining a job with dignity was inextricably tied up with race and gender.
From the start of the industrial revolution, employers used racism to divide workers. And union calls for respect and higher wages were often inseparable from demands that companies hire only white men. The Golden Age didn’t just provide white, male workers with higher wages than everyone else but also what W.E.B. Du Bois called the “public and psychological wage” of a sense of racial superiority.
Just as importantly, white men in the boom years also won stay-at-home wives. With rising male wages, many white women — and a much smaller number of women of other races — could now focus all their energy on caring for home and family. For the women, that meant escape from working at a mill or cooking meals and doing laundry for strangers. But it also meant greater economic dependence on their husbands. For the men, it was another boost to their living standard and status.
Golden Age corporate policies, union priorities, and laws didn’t create the ideal of the white, breadwinner-headed family, but they did reinforce it. Social Security offered benefits to workers and their dependents rather than to all citizens, and excluded agricultural and domestic workers, who were disproportionately black. The GI Bill helped black men far less than white ones and left out most women except to the extent that their husbands’ benefits trickled down to them.
Let’s also set aside the nostalgia about how we used to make shit in this country.
Today, aside from growing income inequality, unstable jobs, and the ever-skyward climb of housing and education costs, a part of the pain white, male workers are feeling is the loss of their unquestioned sense of superiority.
So, can we imagine a future Golden Age? Is there a way to make working for Starbucks fulfill all of us the way we remember line work at GM fulfilling white men? Maybe. With an incredible force of political will, it might be possible to rejigger the economy so that modern jobs keep getting better. It would start with attacking income inequality head-on. The government could bust up monopolistic tech giants, encourage profit-sharing, and maybe even take a step toward redistributing inherited wealth. We’d also need massive social change to ensure people of color and women equal access to the good new jobs, and men and white people would need to learn to live with a loss of the particular psychological wages of masculinity and whiteness.
But even all that would still fail to address one thing that made work in the Golden Age fulfilling for men: the wives. Stay-at-home moms of the mid-twentieth century weren’t just a handy status symbol for their men. They were household managers and caregivers, shouldering the vast majority of child-raising labor and creating a space where male workers could rest and be served. And supporting a family was a key ingredient that made otherwise draining, demeaning jobs into a source of meaning.
Few men or women see a return to that ideal as a good idea today. But try imagining what good, full-time work for everyone looks like without it. Feminist scholar Nancy Fraser describes that vision as the Universal Breadwinner model — well-paid jobs, with all the pride and status that come with them, for all men and women. She notes that it would take massive spending to outsource childcare and other traditionally unpaid “female” work — particularly since those jobs would need to be good jobs too. It would also leave out people with personal responsibilities that they couldn’t, or wouldn’t, hand over to strangers, as well as many with serious disabilities. And it certainly wouldn’t solve the problem many mothers and fathers report today of having too little time to spend with family.
A really universal solution to the problem of bad jobs would have to go beyond “good jobs” in the Golden Age model. It would be a world where we can take pride in our well-paid jobs at Starbucks without making them the center of our identities. That could mean many more part-time jobs with flexible hours, good pay, and room for advancement. It could mean decoupling benefits like health care and retirement earnings from employment and providing a hefty child allowance. Certainly, it would mean a social and psychological transformation that lets both men and women see caring work, and other things outside paid employment, as fully valuable and meaningful as a job.
As a bonus, this kind of solution would also make sense when we do fall back into recession, or if the robots do finally come for a big chunk of our jobs.
All this might sound absurdly utopian. We are, after all, living in a world where celebrity business leaders claim to work 80-plus hour weeks while politicians enthusiastically deny health care to people who can’t work.
But the postwar economy didn’t happen on its own. It was the product of a brutal, decades-long fight led by workers with an inspiring, flawed vision. And today, despite everything, new possibilities are emerging. Single-payer health care is a popular idea, and “socialism” has rapidly swung from a slur to a legitimate part of the political spectrum. Self-help books like The 4-Hour Workweek — which posit the possibility of a radically different work-life balance, albeit based on individual moxie rather than social change — have become a popular genre. Young, black organizers in cities across the country are developing their own cooperative economic models. And if there’s any positive lesson we can take from the current political moment, it’s that you never know what could happen in America. Maybe a new Golden Age is possible. It’s at least worth taking some time to think about how we would want it to look.
***
Livia Gershon is a freelance journalist based in New Hampshire. She has written for the Guardian, the Boston Globe, HuffPost, Aeon and other places.
On the Contentious Borders of the American South

Scholar and writer Zandria F. Robinson narrates her coming of age in Memphis while examining the food, music, and accents of contemporary “southernness” for Oxford American. During her teenage years, the author tried to extricate the South from her voice:
At home in my room with the door closed, I practiced aloud, watching the shape of my mouth and the movements of my tongue in the mirror. I repeated my introduction in different accents: regular, valley girl, Southern, newscaster, New Yorker, and British. I still couldn’t hear how I sounded, but I was desperate to discern and attain a standard American accent—that is, one with no regional mark. I was sixteen years old, trying to make it in the world. I didn’t need no Southern accent perched like a twanging bird on top of my being black and a girl and precariously middle class and a precariously middle-class black girl whose hair wouldn’t get straight all the way no matter the strength or caliber of the relaxer. I switched on the television, hoping to find a Cosby Show rerun so I could study Mrs. Clair Huxtable.
But in echoes of Ralph Ellison’s essay on black regionalism from 1948, “Harlem is Nowhere,” Robinson comes to realize that any notion of “southernness” as separate from “Americanness” is false.
Everybody wants to be Southern but don’t nobody want to be Southern, too. To enjoy the culture, to have gentrified ham hocks, but not to deal with ham hocks’ relationship to slavery or slavery’s relationship to the present and future. Folks want the fried chicken and Nashville and trap country music (an actual thing) and sweet tea, but they don’t want Dylan-with-an-extra-“n” Roof or the monstrous spectacle and violence in Charlottesville or the gross neglect and racism after Katrina. No one wants the parts of the South that make America great again. It’s high time we move beyond the border sketched out in John Egerton’s provocative 1974 book, The Americanization of Dixie: The Southernization of America — the South has been everything below the Canadian border all along. If the Black Lives Matter chapters across Canada weigh in, then the South is above the Canadian border, too. Though I’ll admit that “everything below the Arctic circle” doesn’t have a good ring to it.
Things are dirty on both sides of our nation’s internal border, it’s just that some folks won’t confess it. The borders in us and between us seem ever more real, even as we strive to tear them down in service of one sound, one nation, undivided. But one side always wins, and borders are never neutral. I’m just glad that the border wars in me are over for now … I wonder if America ever will be.