Doctors Without Patients: The Eritrean Physicians Stuck in American Licensing Limbo

Shoshana Akabas | Longreads | October 2021 | 16 minutes (4,762 words)
Haben Araya* was working in the local hospital when a farmer came in, bleeding from his gums. He was suffering from a snakebite — a case she’d seen many times.
*At the request of the doctors involved, some names have been changed.
Before Araya sought asylum in the United States, before she helplessly watched the COVID-19 pandemic tear across the country, and before she learned about what doctors must go through to relicense in America, she worked as one of a handful of physicians on staff at a local hospital in her home country of Eritrea. She was a general practitioner, responsible for everything from pediatric preventative medicine to minor surgeries and gynecology. She served as the regionally appointed physician for malaria case management and the hospital’s Director for Tuberculosis Control. If a patient needed to be transferred to another hospital, she had to write the referral. Call the ambulance. Make sure the ambulance has enough gas. Find someone to fill up the tank.
Snakebite cases were heartbreaking for Araya because she knew the medication was prohibitively expensive: 840 Eritrean Nakfa for a single vial (about 56 USD). Sometimes four or five vials were required, costing more than many farmers would earn in a year.
The hospital insisted on taking some sort of collateral until the bill was paid, but Araya knew the farmers were good for the money. She also knew that they would likely sell their goats or sheep — whatever animals they relied on for their livelihoods — to pay for the treatment. And then, she knew, they and their children would return in a few months’ time with severe cases of malnutrition and a host of consequent health issues.
A nearby military clinic, where there was no on-site physician, had a stock of antivenom. In exchange for a free supply for her patients, Araya told the administrator of the unit that she would provide medical consultation and training. It was not a perfect solution, Araya admits, but her job was to do anything she could for her patients. “We have to do our best with what we know,” she says. “Every day we had to be more than a doctor.”
***
Doctors trained in resource-limited environments possess a unique skill set. They’re adaptable, creative, and work well under pressure. Yet, upon arriving in the U.S., internationally trained physicians like Araya must go through a licensing process so arduous it can take nearly ten years to complete. An estimated 165,000 internationally trained medical professionals currently live in the United States with skills they cannot fully use. Many, like Araya, are sitting on crisis management experience the United States never thought they would need — until the pandemic hit.
Eritrea has a single medical school: the Orotta College of Medicine and Health Sciences, offering a six-year medical program. With only 30 to 40 spots in each graduating class, the nationwide competition was fierce. “When I applied to medical school, my dad always tried to impress on me that I need to have Plan B and Plan C,” says Lily Yemane, an expat Eritrean physician like Araya. But she couldn’t think of any other job she wanted to do.
In the United States, the pandemic forced many doctors who had never experienced shortages to make life-or-death choices about who would be given oxygen, but for Araya and Yemane, that kind of challenge was part of their regular work as physicians. “You have an idea of how a certain patient can be helped, but you don’t have the resources,” explains Yemane. “Two or three patients need a medication, and you have to decide who to give it to.” With only one or two ambulances per hospital, she often fought to convince the administration to deploy their ambulance for her patients.
Resource scarcity wasn’t the only issue. Living under the oppressive regime in Eritrea bled into every aspect of their personal and professional lives. “We don’t choose where we work, we don’t negotiate our salaries,” says Araya. “The government, basically they put our names in a fishbowl.”
Since President Isaias Afwerki came to power following the country’s independence in 1993, freedom has been stifled. Afwerki’s extrajudicial executions, imprisonment of journalists and religious minorities, indefinite forced labor sentences, and other human rights violations have been documented by the United Nations Human Rights Council. Reporters Without Borders, on its World Press Freedom Index this year, ranked Eritrea last, below North Korea. There have been no presidential elections held in the country’s 28-year history. “ … You don’t get any say, you don’t vote. We’ve never voted in our entire life,” says Yemane.
When political prisoners were brought to the hospital for care — often for tuberculosis or scabies, the result of years in captivity — doctors were forced to defer to a system they vehemently opposed. Some prisoners were journalists; others had been caught at the border, trying to flee the country. “You almost never ask why,” says Yemane. “You don’t want to know.”
Each time a prisoner was brought for treatment, Yemane had to convince the guards to admit the patient to the hospital for necessary care, raising suspicions that she was on the prisoner’s side. Except once: Yemane supervised the care of a prisoner with kidney failure. When she went to check on him in the recovery facility, she was surprised to find the patient with his family, and the guards nowhere to be found. “He was free,” she says, “but they only let him go because they thought he was dying.”
There was no single moment that pushed Yemane or Araya to leave and follow their family and friends who had already fled to the U.S. Instead, the burden of oppression and persecution simply grew until they felt they had no choice. “My rights as a human being were being violated,” says Araya. “I did not have the freedom — that basic, basic freedom … we all deserve as human beings.”
***
Yemane did not arrive in the United States naive to American culture or to the challenge ahead. She’d read plenty of English literature and loved watching Oscar-nominated movies, from My Fair Lady to La La Land. But still, the culture shock was real. While waiting the nine months for her work permit to be approved, she lived with a family member and took an anatomy course at the local public college, working towards a physician assistant’s degree in case she couldn’t relicense. Eager to resume medical practice, she also began volunteering at a free clinic, which helped her to feel more at home as she gradually met more like-minded people.
When Araya reached the United States the following year, more than a dozen Eritrean doctors like Yemane — who’d fled in the months before her — warned her of the difficult road ahead. She’d have to have her credentials verified before she could sit for the three intensive U.S. medical licensing exams (USMLE) and apply for a residency program to repeat her training — the last step before finally being able to practice on her own.
For most refugees arriving with few resources, the financial cost — of translating educational records into English, covering the exam fees (nearly $1,000 each), and working a clinical internship (often unpaid) to help get a residency — is prohibitive. And the Eritrean doctors were struggling to get past the very first step in the process. For their primary source verification, authorized representatives from the Eritrean medical school would need to confirm that their documents, including their diploma and transcript, were authentic.
They’d contacted the Educational Commission for Foreign Medical Graduates (ECFMG), a non-governmental, non-profit agency responsible for primary source verification. Of roughly 3,500 operational institutions in the World Directory of Medical Schools, ECFMG accepts credentials from approximately three-quarters — including the medical school in Eritrea. But when Araya and Yemane’s colleagues applied for verification, the Eritrean administrators wouldn’t respond to ECFMG’s inquiries.
The medical school and placement system in Eritrea, like many countries, is controlled by the government, which has the power to withhold the records of anyone they don’t want to assist. “In the eyes of the government,” says Yemane, “we are traitors — which is not true. We served our country when we were there. I worked with very little pay, like everybody else in the country, for four years, outside of my hometown. And we did serve the people. We did our best. But the government was not understanding of that. So when we left, we were considered traitors.”
Kara Oleyn, Vice President for Programs and Services at ECFMG, was assigned to their case. ECFMG sees 20,000 applications each year, and Oleyn was no stranger to verification challenges. When ISIS infiltrated Iraq and medical school officials fled to the south, Oleyn’s team worked with the Iraqi Ministry of Health to track them down, so they could provide verification for their former students. In Crimea, where both the Russian and Ukrainian governments claimed the medical university, they had to determine who was actually authorized to verify credentials. “We do need to assure the public that the individuals who are going to be laying hands on them have the appropriate credentials,” says Oleyn, “and primary source verification is a big part of that.”
But Araya’s and Yemane’s cases — and the cases of their Eritrean colleagues — stumped Oleyn. “There was absolutely no information coming out of Eritrea,” she says.
Araya and her peers were devastated. “The fact that the government I left was able to affect me here — it was just heartbreaking,” says Araya. “America, they gave me protection to stay here, but the [Eritrean] government was able to retaliate and hold me hostage, even when I’m here.”
In rare cases where verification couldn’t be obtained — often for political asylees — the ECFMG used an alternate process: having three U.S.-licensed physicians who attended the same international school swear on their medical license that they have personal knowledge that the individual graduated from medical school. Unfortunately, the Eritrean medical school, founded less than 20 years ago, had no prior graduates working in the United States to provide testimony.
Oleyn’s three-person team relentlessly contacted any sources they thought might be able to share information. “We were trying to triangulate exams that we knew they took in Sudan with Sudanese officials, and we couldn’t get anywhere,” she says. Even the U.S. Department of State couldn’t offer any contacts in Eritrea besides those already refusing to cooperate. Instead, the State Department confirmed what, she recalled, the Eritrean applicants had already told her: “They’re not going to reply to you, because they don’t want their physicians … their young, bright, educated people to leave their country.”
Yemane and Araya’s feeling of helplessness intensified as the pandemic rolled through their new homeland, and they watched as the news quickly became saturated with reports of hospitals running out of beds and doctors to care for COVID patients. When Eritrea went into lockdown, they feared for their friends and family left behind. Yemane would close her eyes and remember the limited number of beds in the hospital’s ICU, imagining them all filled. The staff was already underpaid and overworked before the pandemic.
“In a perfect world, when this happens, what do you do? You just go home and you help, and then you come back,” says Yemane. “We could not go back home, even to help, even to contribute.” And in America, she couldn’t help either. “… Imagine sitting with the capacity to do something but not being able to do anything … What was the whole point of your training if you cannot do something, even in a pandemic?”
Many internationally trained doctors have valuable experience working in the thick of SARS and Ebola epidemics, conflict zones, and other limited-resource conditions — not unlike the conditions faced by hospitals across the United States, as doctors scrambled for personal protective equipment. “When you have a shortage in supplies all the time, you get creative,” Yemane explains. “When we didn’t have ventilators, we could make CPAPs out of things that you can access at the hospital. So we have that kind of mindset.”
Jina Krause-Vilmar, the president and CEO of Upwardly Global, a nonprofit organization that provides career services to immigrants and refugees (including several interviewed for this story), says that, despite knowing the risks of COVID-19, their clients were anxious to help and “in tears about the idea that they were standing on the sidelines at a time when their communities were suffering.”
Unable to assist medical efforts directly, Yemane volunteered for a mutual aid society to help with cooking and delivering food to a local homeless encampment, but she wished she could do more. At the height of the pandemic, “that’s when it was most painful,” she says. “You see the hospitals running low on supplies, on skill[ed workers], and you’re sitting at home doing nothing when you could have been out there helping people.”
In a few select states, desperation finally bred change, and internationally trained physicians were given the opportunity to contribute. New York (home to roughly 13,000 foreign-trained medical professionals not able to make full use of their skills) joined New Jersey, Massachusetts, Nevada, and Colorado in adapting licensing guidelines to allow foreign-trained physicians to help with COVID efforts at various levels — but with limited success.
For some, the application was too difficult. Upwardly Global heard that in one state Russian applicants were deterred because the drop-down menu on the online application accidentally omitted “Russia” as an option for country of origin. Some, like Yemane, applied to New Jersey’s licensing program but never heard back.
“These were emergency policies that were designed and implemented at a time of unprecedented need and at a time when states were trying to mount a response to a public health crisis like no other,” says Jacki Esposito, director of U.S. Policy and Advocacy for World Education Services Global Talent Bridge, a non-profit dedicated to helping international students, immigrants, and refugees achieve their educational and career goals. “So just by virtue of the fact that they were designed and implemented very quickly, there wasn’t the time and the space to consult all of the various stakeholders that would be consulted in a permanent reform process.”
For example, according to Esposito, some states require applicants to have active, valid licenses in another country, but many people — refugees especially — let their licenses lapse to avoid yearly fees and continuing education requirements. Esposito says the application could have required that a foreign license was in good standing when it was last active to accomplish the same goal — of weeding out those applicants with disciplinary actions on their record. “It really was a mix of getting the eligibility requirements right so that they maintain health and safety standards, but at the same time are accessible for applicants,” says Esposito. “Eligibility requirements must be workable for these policies to be effective.”
Without the time to be more intentional about the design of the application process, inform employers about the policy, or conduct outreach to applicants, the opportunity went underutilized. By the end of 2020, the New Jersey Board of Medical Examiners, which operated the most robust program for applicants without residency experience, had received approximately 1,100 applications for temporary medical licenses, but, according to a spokesperson at the New Jersey Division of Consumer Affairs, they issued emergency licenses to only 35 individuals. And according to Gothamist, not all who received emergency licenses were able to secure positions. Many applicants who were eligible for similar programs across the country didn’t know where to look for jobs, and hospitals weren’t sure they were allowed to accept internationally trained applicants — or just thought it was easier to not employ them.
“When push came to shove, the hospitals would rather repurpose a plastic surgeon,” says Tamar Frolichstein-Appel, a senior employment services associate at Upwardly Global, who believes better outcomes could be achieved if healthcare employers, legislators, and NGOs work in partnership. Without buy-in from employers who are willing to hire from this talent pool, a license doesn’t make much of a difference. “It’s a missed opportunity that we have not, as a country, leveraged the immense talent that immigrant and refugee doctors and other healthcare workers offer,” says Esposito.
Amid the crisis, a door was cracked open for a select few. But, by and large, doctors like Araya and Yemane watched the pandemic unfold, stuck outside of a system they desperately wanted to be part of. “We got so antsy to do something,” Yemane says. “It’s a privilege to be able to help in that time, and we didn’t have that.”
***
As more time passed without any news of progress from ECFMG, the persistent uncertainty began to take a toll on the Eritrean doctors stuck in limbo. “A few of us went back to medical school again. But to go to medical school twice in one lifetime — it’s a lot to ask,” says Yemane.
After fleeing Eritrea, another doctor, Abraham Solomon, chose this option to avoid being at the mercy of a stalled bureaucratic process. But he couldn’t simply repeat medical school; he had to go back even further and complete up to 90 credits of undergraduate pre-med requirements before even taking the Medical College Admission Test (MCAT). As he sat through freshman seminars for the second time in his life, he had a strong sense that this situation wasn’t fair, but he had to make peace with it. “What [I] had to do was more important than getting lost in the emotions,” says Solomon, who worked in customer service to pay for school. “At that point, you understand this is something you can’t control.”
Mohamed Khalif, who left Somalia as a refugee when he was two years old, moved around the world with his family before graduating from medical school in China. While studying for the USMLE in Washington State, he worked as a security guard and then took night shifts at a pie factory so he could volunteer at a medical clinic. Khalif has valuable skills and is fluent in five languages, including Urdu and Mandarin, but even after he passed the USMLE he failed to match with a residency program. The screening for residency programs filters out candidates without “hands-on” clinical experience in the United States: few applicants can afford unpaid internships, and few institutions are willing to take them on over U.S. medical students. The applications cost Khalif more than $6,000 each year, in addition to flights and hotels for interviews. After four years, he decided he had to go in another direction.
As the founder of the nonprofit Washington Academy for International Medical Graduates (WAIMG), he now advocates for those who face the same challenges and offers professional development opportunities through his organization. Through this work, he met folks with similar stories, like a Japanese neurosurgeon who married an American and moved to the U.S., but, even after passing the USMLE, was still working at Starbucks because she couldn’t match into a residency program. Khalif’s organization hired her for a job that would count as “hands-on” clinical experience to improve her prospects.
“Once she found this job,” says Khalif, “she actually cried. And I felt that. Because that’s what I’ve been through — those kinds of odd jobs — and I cried with her.” These stories keep him hopeful, even though he’s not able to practice: the fact that he’s making it possible for so many others.
***
The matching process is a major concern for Araya, Yemane, and their peers — not having their official transcripts or diplomas will likely pose problems during the difficult process of applying to residencies — once they even reach that stage. This year, only 55 percent of immigrant international medical graduates who applied for residency were matched to first-year positions, compared to 93 percent of U.S. graduates.
And every year Araya and Yemane have spent fighting for the right to even sit the exams has cost them: The more time that passes after a candidate’s graduation year, the harder it can be to secure a residency match.
“When you only consider somebody’s graduating year as a criteria and not know the story behind that, it hurts a lot of people. It hurts a lot of people who are really passionate,” says Araya. “To come here to fight for all these years to go back into your profession — that tells a lot about the persistence and the passion that person has for medicine.”
Khalif began to look for a solution that wouldn’t require physicians to repeat their entire residency. “Legislators did not know about this match process and this residency process,” says Khalif. “They thought people could apply for residency through Indeed Job Search or something.”
Members from Khalif’s non-profit met with legislators and eventually started gaining traction. “COVID really changed people’s minds,” says Khalif, and in May 2021, Washington Governor Jay Inslee signed into law SHB 1129, which allows limited licenses to be granted to internationally trained doctors in Washington who have completed their USMLE, without requiring residency to be repeated in the U.S. “Once you pass all your exams now, you don’t have to settle for an odd job, or leave the profession like I did,” says Khalif. “You can qualify for a license and work under the supervision of a physician, and you can take care of patients.”
The bill drew overwhelming support from both parties. Republican representative Mary Dye says that her small county of Garfield, with only a handful of doctors, has benefited from internationally trained physicians from Bangladesh and South Korea, who can work without the equipment, facilities, and large medical teams that most U.S. doctors rely on. “In rural America, we need people that have different experiences,” Dye explained. “We’re grateful to have … people that are capable of serving in these remote locations, under challenging conditions, with lots of limitations, and still provide wonderful medical care for our community.”
From the rural healthcare crisis to expanding medical access for at-risk populations, advocates believe internationally trained physicians could be part of the solution if given the opportunity. “I think they have a huge role to play in terms of health equity access, because of that cultural language fluency,” says Krause-Vilmar.
“We need to re-envision what the process is for licensure for doctors in the United States,” says Esposito, “so that we are not leaving out people who have 20 years of experience in a field where we know that we need more doctors.”
Without any change in legislation in California, the current residency hurdles are still daunting for Araya and Yemane, who hope that, when the time comes, institutions will consider their circumstances and give them a chance to prove themselves. “We are all a loss for our country,” Araya says. “I hope we’re not a loss here.”
***
One night, more than a year into the investigation process, Oleyn was working late in her Philadelphia office when she received a call from one of the Eritrean applicants. She detailed everything her team had tried — most recently, reaching out to the medical school in Cuba that had a partnership with the Eritrean medical school. But it was another dead end.
“Anything you can think of,” she asked on the phone that night. Anything at all.
In an attempt to leave no stone unturned, the applicants submitted lists of people they’d come into contact with during medical school — in the hope of providing a useful connection. As Oleyn’s team searched for leads through the lists of names, they found that one was a dean at a U.S. medical school. It turned out that a small number of U.S. physicians — faculty members of American medical schools like George Washington University — helped establish the school in Eritrea. The connection provided a glimmer of hope after months of coming up empty-handed.
A caseworker from Oleyn’s team contacted the dean; he didn’t remember the specific students but put them in touch with other American faculty members who had taught or helped design the post-graduate training curriculum in Eritrea. Oleyn’s team asked those physicians to verify the information about the applicants: the courses they took, which textbooks were used, and their graduation dates. They responded enthusiastically about the qualifications of each applicant and eagerly asked how they could help.
The alternate form of verification — with all the supporting evidence they had amassed — was presented to the ECFMG’s board of trustees, which finally granted approval in summer 2020. Araya and Yemane could move forward to the exam stage. When Yemane heard the news, she felt like she’d finally gotten her life back. “There was a time when I was too scared to be hopeful about that because I didn’t want to be disappointed,” she says.
Solomon had just finished a year of intro courses — Biology, Chemistry, and Physics — when the decision was released. He no longer had to repeat the rest of the prerequisite courses and medical school, and he was thankful to finally have some control over the next steps. “This is a challenge I can overcome,” he says. “An exam is just an exam. You study. You prepare.”
“It’s a good thing that we’re doing this exam,” Yemane says. “It’s a good way to revisit the basic sciences and to familiarize ourselves with what’s most important and most common in this country.”
The Eritrean physicians continue to stay in touch through their WhatsApp group, meeting occasionally, sharing job opportunities, and cheering each other on. Araya says she won’t stop rooting for their success. “Passing the exam, getting matched [with a residency program] has become more than even being a doctor: Just proving that the government back home, the school — whoever could not give us our certificates, credentials — that actually, there is justice in the world, and they could not dictate our professional pathways.”
In a thank-you note Oleyn received, an Eritrean physician wrote: “This shall also afford every graduate the privilege to revisit his/her oath to humanity, to summon his/her medical expertise, and to engage hereafter in the honored service of the people of the United States of America.”
It remains the most gratifying case Oleyn has seen in her 22 years at ECFMG.
***
On a warm Thursday in June 2021, Yemane traveled to San Jose to take her first exam. She hadn’t slept well the night before. Kept awake by nerves, she’d scrolled through Reddit, where other nervous exam-takers shared their anxieties. But in the morning, she pretended she’d had the best sleep of her life. “I think that worked,” she laughs. “I think I fooled my brain.”
The test center was familiar because she’d paid $75 to take a practice exam there earlier that week, but it was nerve-wracking all the same. “There was a lot of pressure on me, because I’m one of the first people taking the exam from my country,” she says. “And we begged for three years for this opportunity.”
She reminded herself that she was prepared. She’d done over 7,000 practice questions. She thought about a text her friend sent, telling her that the test outcome would not change her identity. She imagined her father and mother telling her, “You were created for this.”
When she finished the eight-hour exam, a sense of relief washed over her. This was the hardest test for her; the next one focuses on clinical skills, and she hopes to sit for it in spring 2022. After that, she will take the third and final test. The next challenge — applying for residencies — will be the final step in the long and expensive licensing process.
For now, though, she’s taking one step at a time. As she anxiously awaits the results, she knows that even if she doesn’t get the score she’s hoping for, she was brave just to take the exam after everything she’s been through. “That’s what I’m doing right now,” she says. “I’m celebrating the bravery.”
Shoshana Akabas is a writer and teacher based in New York. She primarily writes fiction and reports on refugee policy and issues of forced migration.
* * *
Editor: Carolyn Wells
Fact checker: Nora Belblidia
The Criminalization of the American Midwife

Jennifer Block | March 2020 | 32 minutes (8,025 words)
Elizabeth Catlin had just stepped out of the shower when she heard banging on the door. It was around 10 a.m. on a chilly November Wednesday in Penn Yan, New York, about an hour southeast of Rochester. She asked her youngest child, Keziah, age 9, to answer while she threw on jeans and a sweatshirt. “There’s a man at the door,” Keziah told her mom.
“He said, ‘I’d like to question you,’” Catlin tells me. A woman also stood near the steps leading up to her front door; neither was in uniform. “I said, ‘About what?’” The man flashed a badge, but she wasn’t sure who he was. “He said, ‘About you pretending to be a midwife.’”
Catlin, a home-birth midwife, was open about her increasingly busy practice. She’d send birth announcements for her Mennonite clientele to the local paper. When she was pulled over for speeding, she’d tell the cop she was on her way to a birth. “I’ve babysat half of the state troopers,” she says.
It was 30 degrees. Catlin, 53, was barefoot. Her hair was wet. “Can I get my coat?” she asked. No. Boots? She wasn’t allowed to go back inside. Her older daughter shoved an old pair of boots, two sizes too big, through the doorway; Catlin stepped into them and followed the officer and woman to the car. At the state trooper barracks, she sat on a bench with one arm chained to the wall. There were fingerprints, mug shots, a state-issue uniform, lock-up. At 7:30 p.m. she was finally arraigned in a hearing room next to the jail, her wrists and ankles in chains, on the charge of practicing midwifery without a license. Local news, citing a joint investigation by state police and the Office of Professional Discipline, reported that Catlin had been “posing as a midwife” and “exploiting pregnant women within the Mennonite community, in and around the Penn Yan area.”
Catlin’s apparent connection with a local OB-GYN practice, through which she had opened a lab account, would prompt a second arrest in December, the Friday before Christmas, and more felony charges: identity theft, falsifying business records, and second-degree criminal possession of a forged instrument. That time, she spent the night in jail watching the Hallmark Channel. When she walked into the hearing room at 8:00 a.m., again in chains, she was met by dozens of women in grey-and-blue dresses and white bonnets. The judge set bail at $15,000 (the state had asked for $30,000). Her supporters had it: Word of her arrest had quickly passed through the tech-free community, and in 12 hours they had collected nearly $8,000 for bail; Catlin’s mother made up the difference. She was free to go, but not free to be a midwife.
Several years back, a respected senior midwife faced felony charges in Indiana, and the county prosecutor allowed that although a baby she’d recently delivered had not survived, she had done nothing medically wrong — but she needed state approval for her work. The case, the New York Times wrote, “was not unlike one against a trucker caught driving without a license.” As prosecutor R. Kent Apsley told the paper, “He may be doing an awfully fine job of driving his truck. But the state requires him to go through training, have his license and be subject to review.”
But what if the state won’t recognize the training or grant a license?
Catlin is a skilled, respected, credentialed midwife. She serves a rural, underserved, uninsured population. She’s everything the state would want in a care provider. But owing to a decades-old political fight over who can be licensed as a midwife, she’s breaking the law.
When American Media Was (Briefly) Diverse

Danielle A. Jackson | Longreads | September 2019 | 16 minutes (4,184 words)
The late summer night Tupac died, I listened to All Eyez on Me at a record store in an East Memphis strip mall. The evening felt eerie and laden with meaning. It was early in the school year, 1996, and through the end of the decade, Adrienne, Jessica, Karida, and I were a crew of girlfriends at our high school. We spent that night, and many weekend nights, at Adrienne’s house.
Our public school had been all white until a trickle of black students enrolled during the 1966–67 school year. That was 12 years after Brown v. Board of Education and six years after the local NAACP sued the school board for maintaining dual systems in spite of the ruling. In 1972, a federal district court ordered busing; more than 40,000 white students abandoned the school system by 1980. The board created specialized and accelerated courses in some of its schools, an “optional program,” in response. Students could enter the programs regardless of district lines if they met certain academic requirements. This kind of competition helped retain some white students, but also created two separate tracks within those institutions — a tenuous, half-won integration. It meant for me, two decades later, a “high-performing school” with a world of resources I knew to be grateful for, but at a cost. There were few black teachers. Black students in the accelerated program were scattered about, small groups of “onlies” in all their classes. Black students who weren’t in the accelerated program got rougher treatment from teachers and administrators. An acrid grimness hung in the air. It felt like being tolerated rather than embraced.
My friends and I did share a lunch period. At our table, we traded CDs we’d gotten in the mail: Digable Planets’s Blowout Comb, D’Angelo’s Brown Sugar, the Fugees’ The Score. An era of highly visible black innovation was happening alongside a growing awareness of my own social position. I didn’t have those words then, but I had my enthusiasms. At Maxwell’s concert one sweaty night on the Mississippi, we saw how ecstasy, freedom, and black music commingle and coalesce into a balm. We watched the films of the ’90s wave together, and while most had constraining gender politics, Love Jones, the Theodore Witcher–directed feature about a group of brainy young artists in Chicago, made us wish for a utopic city that could make room for all we would become.
We also loved to read the glossies — what ’90s girl didn’t? We especially salivated over every cover of Vibe. Adrienne and I were fledgling writers who experimented a lot and adored English class. In the ’90s, the canon was freshly expanding: We read T.S. Eliot alongside Kate Chopin and Chinua Achebe. Something similar was happening in magazines. Vibe’s mastheads and ad pages were full of black and brown people living, working, and loving together and out front — a multicultural ideal hip-hop had made possible. Its “new black aesthetic” meant articles were fresh and insightful but also hyper-literary art historical objects in their own right. Writers were fluent in Toni Morrison and Ralph Ellison as well as Biggie Smalls. By the time Tupac died, Kevin Powell had spent years contextualizing his life within the global struggle for black freedom. “There is a direct line from Tupac in a straitjacket [on the popular February 1994 cover] to ‘It’s Obama Time’ [the September 2007 cover, one of the then senator’s earliest],” former editor Rob Kenner told Billboard in a Vibe oral history. He’s saying Vibe helped create Obama’s “coalition of the ascendant” — the black, Latinx, and young white voters who gave the Hawaii native two terms. For me, the pages reclaimed and retold the American story with fewer redactions than my history books. They created a vision of what a multiethnic nation could be.
* * *
“There was a time when journalism was flush,” Danyel Smith told me on a phone call from a summer retreat in Massachusetts. She became music editor at Vibe in 1994, and was editor in chief during the late ’90s and again from 2006 to 2008. The magazine, founded by Quincy Jones and Time, Inc. executives in 1992, was the “first true home of the culture we inhabit today,” according to Billboard. During Smith’s first stint as editor in chief, its circulation more than doubled. She wrote the story revealing R. Kelly’s marriage to then 15-year-old Aaliyah, as well as cover features on Janet Jackson, Wesley Snipes, and Whitney Houston. Smith was at the helm when the magazine debuted its Obama covers in 2007 — Vibe was the first major publication to endorse the freshman senator. When she described journalism as “flush,” Smith was talking about the late ’80s, when she started out in the San Francisco Bay Area. “Large cities could support with advertising two, sometimes three, alternative news weeklies and dailies,” she said.
The industry has collapsed and remade itself many times since then. Pew reports that between 2008 and 2018, journalism jobs declined 25 percent, a net loss of about 28,000 positions. Business Insider reports losses at 3,200 jobs this year alone. Most reductions have been in newspapers. A swell in digital journalism has not offset the losses in print, and it’s also been volatile, with layoffs several times over the past few years, as outlets “pivot to video” or fail to sustain venture-backed growth. Many remaining outlets have contracted, converting staff positions into precarious freelance or “permalance” roles. In a May piece for The New Republic, Jacob Silverman wrote about the “yawning earnings gap between the top and bottom echelons” of journalism reflected in the stops and starts of his own career. After a decade of prestigious headlines and publishing a book, Silverman called his private education a “sunken cost” because he hadn’t yet won a coveted staff role. If he couldn’t make it with his advantageous beginnings, he seemed to say, the industry must be truly troubled. The prospect of “selling out” — of taking a corporate job or work in branded content — seemed more concerning to him than a loss of the ability to survive at all. For the freelance collective Study Hall, Kaila Philo wrote how the instability in journalism has made it particularly difficult for black women to break into the industry, or to continue working and developing if they do. The overall unemployment rate for African Americans has been twice that of whites since at least 1972, when the government started collecting the data by race. According to Pew, newsroom employees are more likely to be white and male than U.S. workers overall. Philo’s report mentions the Women’s Media Center’s 2018 survey on women of color in U.S. news, which states that just 2.62 percent of all journalists are black women. In a write-up of the data, the WMC noted that fewer than half of newspapers and online-only newsrooms had even responded to the original questionnaire.
* * *
According to the WMC, about 2.16 percent of newsroom leaders are black women. If writers are instrumental in cultivating our collective conceptions of history, editors are arguably more so. Their sensibilities influence which stories are accepted and produced. They shape and nurture the voices and careers of writers they work with. It means who isn’t there is noteworthy. “I think it’s part of the reason why journalism is dying,” Smith said. “It’s not serving the actual communities that exist.” In a July piece for The New Republic, Clio Chang called the push for organized labor among freelancers and staff writers at digital outlets like Vox and Buzzfeed, as well as at legacy print publications like The New Yorker, a sign of hope for the industry. “In the most basic sense, that’s the first norm that organizing shatters — the isolation of workers from one another,” Chang wrote. Notably, Vox’s union negotiated a diversity initiative in their bargaining agreement, mandating 40 to 50 percent of applicants interviewed come from underrepresented backgrounds.
“Journalism is very busy trying to serve a monolithic imaginary white audience. And that just doesn’t exist anymore,” Smith told me. U.S. audiences haven’t ever been truly homogeneous. But the media institutions that serve us, like most facets of American life, have been deliberately segregated and reluctant to change. In this reality, alternatives sprouted. Before Vibe’s launch, Time, Inc. executives wondered whether a magazine focused on black and brown youth culture would have any audience at all. Greg Sandow, an editor at Entertainment Weekly at the time, told Billboard, “I’m summoned to this meeting on the 34th floor [at the Time, Inc. executive offices]. And here came some serious concerns. This dapper guy in a suit and beautifully polished shoes says, ‘We’re publishing this. Does that mean we have to put black people on the cover?’” Throughout the next two decades, many publications serving nonwhite audiences thrived. Vibe spun off, creating Vibe Vixen in 2004. The circulations of Ebony, JET, and Essence, legacy institutions founded in 1945, 1951, and 1970, remained robust — the New York Times reported in 2000 that the number of Essence subscribers “sits just below Vogue magazine’s 1.1 million and well above the 750,000 of Harper’s Bazaar.” One World and Giant Robot launched in 1994, Latina and TRACE in 1996. Honey’s preview issue, with Lauryn Hill on the cover, hit newsstands in 1999. Essence spun off to create Suede, a fashion and culture magazine aimed at a “polyglot audience,” in 2004. A Magazine ran from 1989 to 2001; Hyphen launched with two young reporters at the helm the following year. In a piece for Columbia Journalism Review, Camille Bromley called Hyphen a celebration of “Asian culture without cheerleading” invested in humor, complication, and complexity, destroying the model minority myth. Between 1956 and 2008, the Chicago Defender, founded in 1905 and a noted, major catalyst for the Great Migration, published a daily print edition. During its flush years, the Baltimore Afro-American, founded in 1892, published separate editions in Philadelphia, Richmond, and Newark.
The recent instability in journalism has been devastating for the black press. The Chicago Defender discontinued its print editions in July. Johnson Publications, Ebony and JET’s parent company, filed bankruptcy earlier this year after selling the magazines to a private equity firm in 2016. Then it put up for sale its photo archive — more than 4 million prints and negatives. Its record of black life throughout the 20th century includes images of Emmett Till’s funeral, in which the 14-year-old’s mutilated body lay in state, and Moneta Sleet Jr.’s Pulitzer Prize–winning image of Coretta Scott King mourning with her daughter, Bernice King. It includes casually elegant images of black celebrities at home and shots of everyday street scenes and citizens — the dentists and mid-level diplomats who made up the rank and file of the ascendant. John H. Johnson based Ebony and JET on LIFE, a large glossy heavy on photojournalism with a white, Norman Rockwell aesthetic and occasional dehumanizing renderings of black people. Johnson’s publications, like the elegantly attired stars of Motown, were meant as proof of black dignity and humanity. In late July, four large foundations formed an historic collective to buy the archive, shepherd its preservation, and make it available for public access.
The publications’ written stories are also important. Celebrity profiles offered candid, intimate views of famous, influential black figures and detailed accounts of everyday black accomplishment. Scores of skilled professionals ushered these pieces into being: Era Bell Thompson started out at the Chicago Defender and spent most of her career in Ebony’s editorial leadership. Tennessee native Lynn Norment worked for three decades as a writer and editor at the publication. André Leon Talley and Elaine Welteroth passed through Ebony for other jobs in the industry. Taken together, their labor was a massive scholarly project, a written history of a people deemed outside of it.
Black, Latinx, and Asian American media are not included in the counts on race and gender that WMC reports. They get their data from the American Society of News Editors (ASNE), and Cristal Williams Chancellor, WMC’s director of communications, told me she hopes news organizations will be more “aggressive” in helping them “accurately indicate where women are in the newsroom.” While men dominate leadership roles in mainstream newsrooms, news wires, TV, and audio journalism, publications targeting multicultural audiences have also had a reputation for gender trouble, with a preponderance of male cover subjects, editorial leaders, and features writers. Kim Osorio, the first woman editor in chief at The Source, was fired from the magazine after filing a complaint about sexual harassment. Osorio won a settlement for wrongful termination in 2006 and went on to help launch BET.com and write a memoir before returning to The Source in 2012. Since then, she’s made a career writing for TV.
* * *
This past June, Nieman Lab published an interview with Jeffrey Goldberg, editor in chief of The Atlantic since 2016, and Adrienne LaFrance, the magazine’s executive editor. The venerable American magazine was founded in Boston in 1857. Among its early supporters were Ralph Waldo Emerson, Nathaniel Hawthorne, Herman Melville, and Harriet Beecher Stowe. It sought to promote an “American ideal,” a unified yet pluralistic theory of American aesthetics and politics. After more than a century and a half of existence, women writers are not yet published in proportion to women’s share of the country’s population. The Nieman piece focused on progress the magazine has made in recent years toward equitable hiring and promoting: “In 2016, women made up just 17 percent of editorial leadership at The Atlantic. Today, women account for 63 percent of newsroom leaders.” A few days after the piece’s publication, a Twitter user screen-capped a portion of the interview where Goldberg was candid about areas in which the magazine continues to struggle:
GOLDBERG: We continue to have a problem with the print magazine cover stories — with the gender and race issues when it comes to cover story writing. [Of the 15 print issues The Atlantic has published since January 2018, 11 had cover stories written by men. — Ed.]
It’s really, really hard to write a 10,000-word cover story. There are not a lot of journalists in America who can do it. The journalists in America who do it are almost exclusively white males. What I have to do — and I haven’t done this enough yet — is again about experience versus potential. You can look at people and be like, well, your experience is writing 1,200-word pieces for the web and you’re great at it, so good going!
That’s one way to approach it, but the other way to approach it is, huh, you’re really good at this and you have a lot of potential and you’re 33 and you’re burning with ambition, and that’s great, so let us put you on a deliberate pathway toward writing 10,000-word cover stories. It might not work. It often doesn’t. But we have to be very deliberate and efficient about creating the space for more women to develop that particular journalistic muscle.
My Twitter feed of writers, editors, and book publicists erupted, mostly at the excerpt’s thinly veiled statement on ability. Women in my timeline responded with lists of writers of longform — books, articles, and chapters — who happened to be women, or people of color, or some intersection therein. Goldberg initially said he’d been misquoted. When Laura Hazard Owen, the deputy editor at Nieman who’d conducted the interview, offered proof that Goldberg’s statements had been delivered as printed, he claimed he had misspoken. Hazard Owen told the L.A. Times she believes that The Atlantic is, overall, “doing good work in diversifying the staff there.”
Still, it’s a difficult statement for a woman writer of color to hear. “You literally are looking at me and all my colleagues, all my women colleagues and all my black colleagues, all my colleagues of color and saying, ‘You’re not really worthy of what we do over here.’ It’s mortifying,” Smith told me. Goldberg’s admission may have been a misstatement, but it mirrors the continued whiteness of mainstream mastheads. It checks out with the Women’s Media Center’s reports and the revealing fact of how much data is missing from even those important studies. It echoes the stories of black women who work or worked in journalism, who have difficulty finding mentors, or who burn out from the weight of wanting to serve the chronically underserved. It reflects my own experiences, in which I have been told multiple times in a single year that I am the only black woman editor that a writer has ever had. But it doesn’t corroborate my long experience as a reader. What happened to the writers and editors and multihyphenates from the era of the multicultural magazine, that brief flash in the ’90s and early aughts when storytellers seemed to reflect just how much people of color lead in creating American culture? Who should have formed a pipeline of leaders for mainstream publications when the industry began to contract?
* * *
In addition to her stints at Vibe, Smith also edited for Billboard and Time, Inc. publications, and published two novels. She was culture editor for ESPN’s digital magazine The Undefeated before going on book leave. Akiba Solomon is an author, the editor of two books, and currently the senior editorial director at Colorlines, a digital news daily published by Race Forward. She started an internship at YSB in 1995 before going on to write and edit for Jane, Glamour, Essence, Vibe Vixen, and The Source. She told me that even at magazines without predominantly black staff, she’d worked with other black people, though not often directly. At black magazines, she was frequently edited by black women. “I’ve been edited by Robin Stone, Vanessa DeLuca [formerly editor-in-chief of Essence, currently running the Medium vertical ZORA], Ayana Byrd, Kierna Mayo, Cori Murray, and Michaela Angela Davis.” Solomon’s most recent magazine byline was last year, an Essence story on black women activists who organize in culturally relevant ways to fight and prevent sexual assault.
Solomon writes infrequently for publications now, worn down by conditions in journalism she believes are untenable. At the hip-hop magazines, the sexism was a deterrent, and later, “I was seeing a turn in who was getting the jobs writing about black music” when it became mainstream. “Once folks could divorce black music from black culture it was a wrap,” she said. At women’s magazines, Solomon felt stifled by “extremely narrow” storytelling. Publishing, in general, Solomon believes, places unsustainable demands on its workers.
When we talk about the death of print, we seldom talk about the conditions that made it ripe for obsolescence. The reluctance with which mainstream media has integrated its mastheads (or kept them integrated) has meant the industry’s content has suffered. And the work environments have placed exorbitant burdens on the people of color who do break through. In Smith’s words:
You feel that you want to serve these people with good and quality content, with good and quality graphics, with good and quality leadership. And as a black person, as a black woman, regardless of whether you’re serving a mainstream audience, which I have at a Billboard and at Time, Inc., or a multicultural audience, which I have at Vibe, it is difficult. And it’s actually taken me a long time to admit that to myself. It does wear you down. And I ask myself why have I always, always stayed in a job two and a half to three years, especially when I’m editing? It’s because I’m tired by that time.
In a July story for Politico, black journalists from The New York Times and the Associated Press talked about how a sophisticated understanding of race is critical to ethically and thoroughly covering the current political moment. After the August 3 massacre in El Paso, Lulu Garcia-Navarro wrote how the absence of Latinx journalists in newsrooms has created a vacuum that allows hateful words from the president to ring unchallenged. Lacking the necessary capacity, many organizations cover race-related topics, often matters of life and death, without context or depth. As outlets miss the mark, journalists of color may take on the added work of acting as “the black public editor of our newsrooms,” as Astead Herndon from the Times said on a Buzzfeed panel. Elaine Welteroth wrote about the physical exhaustion she experienced during her tenure as editor in chief at Teen Vogue in her memoir More Than Enough. She was the second African American editor in chief in parent company Condé Nast’s 110-year history:
I was too busy to sleep, too frazzled to eat, and TMI: I had developed a bizarre condition where I felt the urge to pee — all the time. It was so disruptive that I went to see a doctor, thinking it may have been a bladder infection.
Instead, I found myself standing on a scale in my doctor’s office being chastised for accidentally dropping nine more pounds. These were precious pounds that my naturally thin frame could not afford to lose without leaving me with the kind of bony body only fashion people complimented.
Condé Nast shuttered Teen Vogue’s print edition in 2017, despite record-breaking circulation, increased political coverage, and an expanded presence on the internet during Welteroth’s tenure. Welteroth left the company to write her book and pursue other ventures.
Mitzi Miller was editor in chief of JET when it ran the 2012 cover story on Jordan Davis, a Florida teenager shot and killed by a white vigilante over his loud music. “At the time, very few news outlets were covering the story because it occurred over a holiday weekend,” she said. To write the story, Miller hired Denene Millner, an author of more than 20 books. With interviews from Jordan’s parents, Ron Davis and Lucy McBath, the piece went viral and was one of many stories that galvanized the contemporary American movement against police brutality.
Miller started working in magazines in 2000, and came up through Honey and Jane before taking the helm at JET then Ebony in 2014. She edits for the black website theGrio when she can and writes an occasional piece for a print magazine roughly once a year. Shrinking wages have made it increasingly difficult to make a life in journalism, she told me. After working at a number of dream publications, Miller moved on to film and TV development.
Both Miller and Solomon noted how print publications have been slow to evolve. “It’s hard to imagine now, particularly to digital native folks, but print was all about a particular format. It was about putting the same ideas into slightly different buckets,” Solomon said. On the podcast Hear to Slay, Vanessa DeLuca spoke about how reluctant evolution may have imperiled black media. “Black media have not always … looked forward in terms of how to build a brand across multiple platforms.” Some at legacy print institutions still seem to hold internet writing in lower esteem (“You can look at people and be like, well, your experience is writing 1,200-word pieces for the web and you’re great at it, so good going!” were Goldberg’s words to Nieman Lab). Often, pay structures reflect this hierarchy. Certainly, the internet’s speed and accessibility have lowered barriers to entry and made it such that rigor is not always a requirement for publication. But it’s also changed information consumption patterns and exploded the possibilities of storytelling.
Michael Gonzales, a frequent contributor to this site and a writer I’ve worked with as an editor, started in magazines in the 1980s as a freelancer. He wrote for The Source and Vibe during a time that overlapped with Smith’s and Solomon’s tenures, the years now called “the golden era of rap writing.” Those years correspond to the ones I spent reading magazines with my high school friends. At black publications, he worked with black women editors all the time, but “with the exception of the Village Voice, none of the mainstream magazines employed black editors.” Despite the upheaval of the past several years (“the money is less than back in the day,” he said), Gonzales seems pleased with where his career has landed: “I’ve transformed from music critic/journalist to an essayist.” He went on to describe the current moment, with its proliferation of digital magazines:
I feel like we’re living in an interesting writer time where there are a number of quality sites looking for quality writing, especially in essay form. There are a few that sometimes get too self-indulgent, but for the most part, especially in the cultural space (books, movies, theater, music, etc.), there is a lot of wonderful writing happening. Unfortunately you are the only black woman editor I have, although a few years back I did work with Kierna Mayo at Ebony.
* * *
Danielle A. Jackson is a contributing editor at Longreads.
Editor: Sari Botton
Fact checker: Steven Cohen
Copy editor: Jacob Z. Gross
American Green

Ted Steinberg | American Green | W. W. Norton & Company | March 2006 | 43 minutes (7,070 words)
Although there are plenty of irrational aspects to life in modern America, few rival the odd fixation on lawns. Fertilizing, mowing, watering — these are all-American activities that, on their face, seem reasonable enough. But to spend hundreds of hours mowing your way to a designer lawn is to flirt, most would agree, with a bizarre form of fanaticism. Likewise, planting a species of grass that will make your property look like a putting green seems a bit excessive — yet not nearly as self-indulgent as the Hamptons resident who put in a nine-hole course with three lakes, despite being a member of an exclusive golf club located across the street. And what should we make of the Houston furniture salesman who, upon learning that the city was planning to ban morning mowing — to fight a smog problem comparable to Los Angeles’s — vowed to show up, bright and early, armed and ready to cut? “I’ll pack a sidearm,” he said. “What are they going to do, have the lawn police come and arrest me?”
Surprisingly, the lawn is one of America’s leading “crops,” amounting to at least twice the acreage planted in cotton. In 2007, it was estimated that there were roughly twenty-five to forty million acres of turf in the United States. Put all that grass together in your mind and you have an area, at a minimum, about the size of the state of Kentucky, though perhaps as large as Florida. Included in this total were fifty-eight million home lawns plus over sixteen thousand golf-course facilities (with one or more courses each) and roughly seven hundred thousand athletic fields. Numbers like these add up to a major cultural preoccupation.
Reckoning With Georgia’s Increasing Suppression of Asian American Voters

Anjali Enjeti | Longreads | December 2018 | 18 minutes (4,620 words)
Early on November 6, Election Day, Kavi Vu noticed that some voters appeared distressed as they exited Lucky Shoals Park Recreation Center, one of five polling places in Gwinnett County, Georgia. A volunteer with the nonprofit, nonpartisan civil rights organization Asian Americans Advancing Justice — Atlanta (“Advancing Justice”), Vu had been standing outside to answer questions about voting and offer her services as a Vietnamese translator.
When she began asking the mostly African American, Asian American, and Latinx voters about their voting experiences, she learned that after 2.5-hour wait times, many of them had voted via provisional ballots.
Why? As it turned out, Lucky Shoals was not their correct voting location. “A lot of people had lived in Gwinnett County their entire lives and voted at the same location, and all of the sudden they were switched up to a new location,” Vu said.
So when poll workers offered voters the option of voting at Lucky Shoals with provisional ballots, rather than driving elsewhere to wait in another line, the voters took them up on it. They left with “I’m a Georgia Voter” stickers and printed instructions for how to cure their ballots. But poll workers didn’t verbally explain to the voters that they’d need to appear at the county registrar’s office within three days to cure their ballots, nor did the poll workers make it clear that the votes would not count at all if the voters failed to do so. What’s more, as the day wore on, poll workers ran out of the provisional ballot instructions altogether.
Vu was alarmed. In an attempt to reduce the number of voters using provisional ballots, she began offering to help voters locate their correct polling place using the Secretary of State’s website. That’s when poll workers began repeatedly confronting her about her presence outside the polling place. “They told me to stop speaking with voters in line, even after I explained what I was doing.”
By mid-afternoon, Vu counted some 100 voters who had wrongly reported to Lucky Shoals. When she finally left, eight hours after arriving, she was “heartbroken” over the dreadful conditions at the polling place and the number of votes by minority voters that would likely never be counted.
Taming the Great American Desert

John F. Ross | The Promise of the Grand Canyon | Viking | July 2018 | 24 minutes (6,540 words)
In April 1877, the normally staid proceedings of the National Academy of Sciences’ annual meeting in Washington took a dramatic turn. For two weeks, members had listened to the nation’s most distinguished scientists speak on topics ranging from lunar theory to the structures of organic acids. Members enjoyed “Results of Deep Sea Dredging,” by the son of the recently deceased scientist Louis Agassiz. The Academy had invited G. K. Gilbert to deliver a paper, “On the Structure of the Henry Mountains,” the range so named by Powell’s survey in honor of the Academy’s president. On the final day, the geologists took the floor, whereupon erupted a furious discussion of the American West. The rub lay between those who studied the fossils and those who examined the rock strata, each drawing wildly different conclusions about the age of their subjects.
Such was the fervor of the discussion that the geologists soon jumped to their feet in animation and anger. “[W]hat they might do if they once went fairly on the rampage, it is impossible to say,” wrote one correspondent. Hayden rose to argue that no great degree of difference existed between the two sides, but others immediately shouted him down.
Yet while the rather scholarly debates over dating and provenance might animate the geologists, that day would be remembered not for these petty theatrics, but for an address Powell delivered. In it, the Major stepped away from the fields of geology and out of academic realms to address a topic that pressed right to the heart of American democracy. During the Townsend Hearings three years earlier, he had raised the issue of the West’s extreme aridity and the difficulty of irrigating much of it — but he had thought a lot more about it since then, and the map he now unrolled in front of America’s top scientists carried startling implications. He had bisected the map of the nation from Mexico to Canada with a vertical line rising from central Texas up through Kansas, east of Nebraska, and through Minnesota, roughly approximating the 100th meridian. At this line the arid West begins with startling consistency: the tall prairie grass cedes to short grass and less fertile soils. Trees appear rarely west of the line, except at high altitudes and in the Pacific Northwest, while forests dominate the east. The 100th meridian elegantly divides two separate lands, one composed of wide horizontal vistas, the other so much defined by its vertical prospects.
The land west of the 100th meridian, Powell announced, could not support conventional agriculture. Surprise met this bold statement, for the line clearly indicated that much of the Great Plains — including all of Colorado, Montana, Wyoming, and Idaho, plus Arizona and New Mexico — was essentially unfarmable. Here was the professor at his best: clear, authoritative, dramatic. He had everyone’s attention.
Powell had drawn an isohyet, a line connecting areas that experience equal volumes of annual rainfall. The relatively humid lands to the east of this line receive twenty or more inches of annual rainfall; the unquestionably arid lands to the west receive less, except for some narrow strips on the Pacific coast. The twenty-inch isohyet offered a valuable generalization — conventional agriculture simply could not work without twenty or more inches a year, unless supplemented by irrigation. Except for some lands offering timber or pasturage, the far greater part of the land west of the line was by itself essentially not farmable. Access to the transformative powers of water, not the availability of plots of land, proved the far more valuable commodity. By now, nearly all land through which streams passed had been acquired, some of the owners charging those less fortunate for irrigation water. “All the good public lands fit for settlement are sold,” Powell warned. “There is not left unsold in the whole United States land which a poor man could turn into a farm enough to make one average county in Wisconsin.”
Much of what Powell reported was not exactly new, but no one had presented the data so comprehensively and convincingly — and no one so famous as the Major. Few, of course, doubted the region’s aridity. But in one powerful moment, Powell had claimed that the nation’s traditional system of land use and development — and thus America’s present push west — simply would not work. The debate that Powell provoked that late April day drew an immediate and blistering response. The land agent for the Northern Pacific Railway, itself the beneficiary of a government grant of nearly four million acres, hammered back at Powell’s “grave errors.” “[P]ractical farmers, by actual occupancy and cultivation, have demonstrated that a very considerable part of this ‘arid’ region, declared by Major Powell as ‘entirely unfit for use as farming lands,’ is, in fact, unexcelled for agricultural purposes.” Others responded similarly. Powell clearly had touched a raw nerve. Over the next several years, he would have much more to say on the matter, igniting a veritable firestorm. While the other surveyors limited themselves to covering as much ground as possible, Powell now wrestled with the startling implications for the ongoing development of the West — and what that meant for the American democracy he had fought so hard to save.
***
For most of the first half of the 19th century, eastern America’s conception of the western portion of North America could be spelled out in three words: Great American Desert. The phrase originated during the Long Expedition of 1819, when President James Monroe directed his secretary of war to send Stephen H. Long of the U.S. Army Corps of Topographical Engineers with a small complement of soldiers and civilian scientists on a western reconnaissance. Secretary of State John Quincy Adams had just negotiated a treaty with Spain that ceded Florida to the United States and drew a border between the two countries running across the Sabine River in Texas, west along the Red and Arkansas rivers, and all the way to the Pacific. Eager to know more about the border and the new western territory, Monroe had the secretary of war direct Long to follow the Platte River up to the Rocky Mountains, then trace south and back east along the new border.
The energetic New Hampshire–born West Pointer envisioned himself the successor to Meriwether Lewis and William Clark — indeed, over the course of five expeditions, he would cover 26,000 miles and mount the first steamboat exploration up the Missouri into Louisiana Purchase territory. His name would grace the peak that Powell was first to climb. On this expedition, Long split his group into two, sending one party along the Arkansas while he and the rest headed south to chart the Red River. Long’s men, often parched and starving, battled a violent hailstorm, sometimes resorted to eating their horses, and negotiated their way past a band of Kiowa-Apaches. But the maps they carried were so atrociously inaccurate that the river they followed for weeks was not the Red at all.
***
Three years after Long’s party returned home, expedition member Edwin James published the three-volume Account of an Expedition from Pittsburgh to the Rocky Mountains. Long’s ordeal imbued him with little affection for the “dreary plains” they had traversed. The Great Plains from Nebraska to Oklahoma he found were “wholly unfit for cultivation and of course uninhabitable by a people depending on agriculture.” He added: “The traveler who shall at any time have traversed its desolate sands, will, we think, join us in the wish that this region may forever remain the unmolested haunt of the native hunter, the bison, the jackall.” The accompanying map labeled the area a “Great Desert,” terminology that soon fully flowered into the “Great American Desert,” a colorful appellation that would stick to the indefinable sections of the West for the next generation. Long believed that this desert wilderness served as a natural limitation on American western settlement, acting as an important buffer against the Mexican, British, and Russians, who claimed the western lands beyond. That compelling assertion seemed to resonate in the public imagination, locking into place the notion of a vast desert dominating the nation’s western midsection. “When I was a schoolboy,” wrote Colonel Richard Irving Dodge in 1877, “my map of the United States showed between the Missouri River and the Rocky Mountains a long and broad white blotch, upon which was printed in small capitals THE GREAT AMERICAN DESERT — UNEXPLORED.
Even though some early trappers and mountain men had brought back word of a land often far from desertlike, the idea persisted. In 1844, when U.S. naval officer Charles Wilkes published his five-volume Narrative of the United States Exploring Expedition, it included a map of upper California. Inland from the well-detailed Pacific coast lay the Sierra Nevada, while the front range of the Rockies marked the map’s eastward extension. In between the ranges lay a vast, wedge-shaped blank space, without a single physical feature delineated. Unable to leave such a realm blank without remark, Wilkes had inserted a simple paragraph reading “This Plain is a waste of Sand. . . .” Like the sea monsters inhabiting the unknown sections of medieval maps, he — like Long — had condemned the entire region, the dead space not even worthy of a second look. Eleven years later, a Corps of Topographical Engineers map sought to add detail, but could insert only a tenuous dotted line that indicated some cartographer’s wild guess about the Colorado River’s course.
Cracks started appearing in the notion of a Great American Desert during the early 1840s expeditions of John Charles Frémont, son-in-law of that powerful advocate of Manifest Destiny, Senator Thomas Benton. With his backing, Frémont led both a four-month survey of the newly blazed Oregon Trail in 1842 and an audacious fourteen-month, 6,475-mile circuit of the West, beginning in 1843. Frémont’s subsequent reports combined a deft mix of hair-raising adventure with scientific discovery, thrilling readers with images of guide Kit Carson and the so-called Pathfinder himself running up a flag atop a vertiginous Rocky Mountain peak. The maps accompanying the reports furnished emigrants with an accurate road map for the journeys that thousands would take west in the 1840s and 1850s. Frémont’s reports indicated that the interior West certainly contained stretches of truly arid land, but that it was no unbroken Sahara. Yet even so, the pioneers and gold seekers understood that great opportunities lay not in this parched region, but beyond, at the end of the trails, in Oregon and California. Most of the West still remained no more than a place to get across.
In the late 1850s, a rather startling shift turned the idea of the Great American Desert on its head. “These great Plains are not deserts,” wrote William Gilpin in a late 1857 edition of the National Intelligencer, “but the opposite, and are the cardinal basis of the future empire of commerce and industry now erecting itself upon the North American Continent.” Gilpin, the electric-tongued son of a wealthy Philadelphia Quaker paper merchant, would do more than any other single individual to persuade his fellow citizens that America’s great midsection was a garden only waiting to be plowed. Whereas the term Manifest Destiny had been coined as a justification for conquering great swaths of the continent at gunpoint, Gilpin transformed it into a more wholesome interpretation that pulled peoples across the nation. It also had the weight of the Enlightenment’s commandment, articulated by the philosopher John Locke, that God and reason commanded humans to subdue the earth and improve it. As Civil War soldiers returned home, all America could climb on board with Gilpin’s fantastical promises, any threatening idea of a great desert now disregarded. He had given America what it most wanted to hear: the promise that its growth was unlimited, its western lands a never-ending buffet of opportunity, limited only by a lack of imagination and courage.
Gilpin had impressive credentials: Not only had he joined Frémont and Kit Carson on their expedition to Oregon in 1843, but as an army officer he had fought the Seminoles in Florida, served as a major in the First Missouri Volunteers during the Mexican War, and marched against the Comanche to keep the Santa Fe Trail open. A columnist for the Kansas City Star observed that “his enthusiasm over the future of the West was almost without limitation.” He became a disciple of Alexander von Humboldt, the great German geographer, who published the early volumes of his Cosmos in the late 1840s, elaborating the thesis that geography, climate, and biota incontrovertibly shaped the growth of human society. Gilpin pressed the Humboldtian idea that much of North America lay within an Isothermal Zodiac, a belt some thirty degrees wide running across the Northern Hemisphere, which contained climatic conditions ideal for human civilization to blossom. Herein lay the justification for Gilpin’s remarkable, if fanciful, theory that rationalized American exceptionalism. In three letters to the National Intelligencer in the late 1850s, later developed into an influential book, Gilpin outlined how North America’s convex shape had determined its grand destiny. The Mississippi Valley drained the bowl that was defined by the Appalachians to the east and the Sierra Nevada and Rockies to the west. By contrast, the Alps of Europe and the Himalayas of Asia rose in the center of their continents, forming insurmountable barriers to any continental unity. The geographical realities of Europe and Asia broke them up into small states and away from common centers, forcing upon them a history of unending warfare. North America, Gilpin grandly declaimed, had a national, unified personality. Thus endowed with a centripetal, unifying geography that encouraged a single language and the easy exchange of ideas, and that favored the emergence of a continental power, North America stood ready to achieve world primacy.
Gilpin claimed that America would fulfill its destiny in the so-called Plateau of North America, the region between the main Rockies and the Sierra Nevada, “the most attractive, the most wonderful, and the most powerful department of their continent, of their country, and of the whole area of the globe.” Here Gilpin shone at his most incandescent, piling sheer fantasy built on pseudo-science and hope ever higher. As the war ended, most Americans had embraced the West as an untapped Eden, not as the barren edge bounding the American nation, but as the very place in which it would fulfill its national destiny.
Certainly, other forces supported such a change of heart about the West. The railroads — America’s most visible instrument of Manifest Destiny — adopted such sentiments with enthusiasm. To encourage the largely authentic, nation-building efforts of the railroad companies, the federal government bestowed vast swaths of public land abutting their tracks on these rising great powers, many now laying track furiously across the continent. Their long-term interests hinged on the high value of the land they penetrated. The West as garden, rather than desert, suited their ambitions far better, and railroad publicists rolled out a relentless tide of promotional material. Utah was a promised land, proclaimed the Rio Grande and Western Railroad. “You can lay track through the Garden of Eden,” said Great Northern Railroad’s founder J. J. Hill, “[b]ut why bother if the only inhabitants are Adam and Eve?”
A new, supposedly scientific, idea arose to support the vision of productive dryland farming. The “rain follows the plow” theory became chaplain of the western movement. Simply cultivating the arid soil, this theory postulated, would bring about permanent changes in the local climate, turning it more humid and thus favorable to crops. The climatologist Cyrus Thomas, who had founded the Illinois Natural History Society that had given Powell his chance, became one of the theory’s strongest advocates. “Since the territory [of Colorado] has begun to be settled, towns and cities built up, farms cultivated, mines opened, and road made and travelled, there has been a gradual increase in moisture . . . ,” he wrote. “I therefore give it as my firm conviction that this increase is of a permanent nature.” Hayden, along with many other national personalities, endorsed this intoxicating but deeply flawed theory.
In 1846, Gilpin addressed the U.S. Senate, asserting that “progress is God” and that the “destiny of the American people is to subdue the continent — to rush over this vast field to the Pacific Ocean . . . to change darkness into light and confirm the destiny of the human race. . . . Divine task! Immortal mission!” Even at a time lit up by fiery eloquence, Gilpin stood out, his giddy pronouncements seismic in their appeal, emotionally resonant, wrapped in morality, and nationalistic in self-praise. Few could resist so powerful an appeal. And few did.
Gilpin and Powell had met at least once, in Denver City, on the Major’s first trip west in 1867. The ex-governor had probably waxed eloquent about the great promise of the West, perhaps even suggested that the Colorado River lay open to exploration. No record exists of their conversation, but Powell did not seek out his help or opinions after that. The Major found himself more comfortable with William Byers’s gritty practicality.
Indeed, Powell had no truck with the “rain follows the plow” theory. He believed that the Southwest was indeed a desert, one that could be cultivated, but only with the careful marshaling of the limited resource of water. Powell’s urging of caution elicited widespread groans and charges that he was backward-looking. That summer, he quietly ordered his senior investigators west to gather data on irrigation practices. Ostensibly traveling to northern Utah to classify land, Gilbert would examine Mormon water-delivery technology in the Great Salt Lake drainage area. Dutton would continue his geologic studies on the Colorado Plateau, but take some time off to survey irrigable lands in the Sevier River Valley and measure the river’s flow.
***
On March 8, 1878, Representative John Atkins of Tennessee, chair of the House Appropriations Committee, introduced a resolution that called for the secretary of the interior to submit a report summarizing the operations, expenses, and overlaps of the work conducted by geological and geographical surveys over the past ten years. During the subsequent hearings, Wheeler, Hayden, and Powell testified about their surveys.
Powell’s young secretary would recall how Wheeler appeared dignified but aloof in his testimony. Hayden came on like a freight train, bitter and at length. He immodestly championed his work above the others and claimed that no duplication among the surveys had occurred. Once Hayden had finally finished his statement, the exhausted committee turned to Powell. In silence, the room of congressmen and a large assembled audience waited as Powell paced back and forth in the chamber, his stump clasped behind his back. All expected an impassioned speech denouncing Hayden’s claims one by one. But Powell ignored the earlier testimony. He gave a calm, even-keeled appraisal of his own work, applauded the achievements of the others, and then contended that much overlap between the surveys had occurred. Soon the entire committee was following his every word. “It was plain to see,” noted his assistant, “that the day was won.”
But even the ascendancy he gained at the congressional hearings did not satisfy Powell. Never one to sit back, he prepared to make the riskiest, most brazen gamble of his career — even eclipsing the decision to run the Colorado. One of his greatest intrinsic strengths lay in realizing that opportunity so often arises out of good timing. The timing now — with the survey consolidation in full press and congressional discussion bubbling away — offered an optimal chance to take hold of the narrative and change its course. The report he would release was nothing less than explosive. He would reach far beyond his own survey work, indeed push so far beyond the bounds of a federal bureaucrat as to astound observers, seeming to shoulder the whole American experiment and bear it westward.
While Hayden and Wheeler conducted their fieldwork during the summer of 1877, Powell had stayed home, working assiduously on a document that built on the ideas he had presented to the National Academy of Sciences the year before. His Report on the Lands of the Arid Region of the United States, delivered to Interior Secretary Schurz on April 1, 1878, would be monumental and astonishing, and, in the words of a respected mid-twentieth-century historian, “[o]ne of the most remarkable books ever written by an American.” Starting with Charles A. Schott’s meteorological observations, buttressed by Gilbert’s and Dutton’s ground measurements of water requirements necessary for irrigation, Powell presented a formal, prescriptive plan for developing the West. In this report he integrated a lifetime of thought and observation, ranging from his childhood experiences in the Wisconsin grain fields to his close study of Mormon irrigation techniques, and informed by the network of ancient Pueblo canals and customs of Mexican water sharing. The thousands of miles he had walked, ridden, and climbed in the West keenly but invisibly shaped the document. At its core lay the realization battered into him on his first journey down the Colorado about humanity’s impermanence in the face of geologic time and how the Earth remained in a continual state of flux. It was more manifesto than scientific report, many of its conclusions based on incomplete evidence, much of the data hardly better than educated guesses.
Yet the conclusions have since proved ecologically sound and indeed remarkably spot-on. The report opened with a lengthy appraisal of the topography of the American West, including estimates of the amount of potentially irrigable land, timberland, and pasturage, before launching into a full-frontal assault on the current land-grant system, still rooted in the 1862 Homestead Act’s stipulation that any American adult could receive 160 acres, contingent upon demonstrating an ability to live on the land and improve it. While that system might work well in Wisconsin or Illinois, Powell argued, the arid West could not successfully support 160-acre homesteading. Those westgoers flocking into the arid lands beyond the 100th meridian would see their dreams dashed by spindly crops. Powell had directly contradicted Gilpin’s soaring promises. America could not have everything it wanted.
Powell’s recommendations focused first on classifying lands, then directing their use accordingly: Low-lying lands near water that were west of the 100th meridian should be available in 80-acre lots, while water-limited areas should be parceled into 2,560-acre units for pasturage. High mountain tracts under an abundance of timber should be made available to lumbermen.
He did not deny that drylands could be redeemed, but the limiting factor, as he had noted before, was water. Irrigation could “perennially yield bountiful crops,” but the West contained few small streams that could be diverted by canal to fields, and those available were already being exploited to the limit in Utah and Arizona. Such large rivers as the Colorado ran through deep chasms and hostile ground, mostly far from any potential cropland. Only “extensive and comprehensive” actions — dams and distribution systems — could deliver the water, and only those with the means to undertake the task — not individual farmers, being poor men — could pursue it. If not carefully planned, wrote Powell, the control of agriculture would fall into the hands of water companies owned by rich men, who would eventually use their considerable power to oppress the people. He painted a truth that still rankles many today who believe in the myth of the rugged, independent westerner. He asserted that the development of the western lands depended not so much on the individual landowner as on the intervention of the federal government, the only entity that could survey and map the land, build dams and other reclamation projects, administer vast swaths of public lands, oversee federal land grants, and tackle the displacement of the indigenous peoples. The lone cowboy taming the land with lasso and fortitude may fit the myth of the West, but the reality was quite different. Put simply, the West’s aridity required that overall public interest trump that of the individual.
The man who had previously limited himself to describing the topographic and geologic formations of the western lands had now waded directly into populist politics, driven by isohyets and tables of rainfall-per-acre statistics. Powell believed that the very republican dream of the small farmer was at risk under the crushing power of monopolistic interests. Such resistance aligned with his core childhood beliefs. He had seen the local grain operator in Wisconsin abuse powerless farmers with impunity. The stakes, as he saw them, were of the highest order, threatening the country’s very fulfillment. With the Arid Lands report, Powell had taken on not only Hayden and his congressional supporters, Wheeler and the army, but also the General Land Office, the railroads, and the likes of William Gilpin — an overwhelming front of entrenched beliefs, myths, and nation-building passion, the very patrimony of Manifest Destiny. He had taken a hard shot directly at virtually unchallengeable assumptions about the unlimited wealth of American resources and the bright future of the great West — and also at who would have access to whatever wealth the West had to offer.
Powell saw that arid cultures stood or fell — and mostly fell — not on their absolute amounts of water, but on how equitably political and economic systems divided limited resources, and on whether those systems could evolve in the face of climatic and societal changes. To Powell, the Homestead Act, which imposed an arbitrary eastern standard of 160-acre parcels regardless of topography, rainfall, nearness to water, altitude, and other critical factors, appeared the height of folly, the blind, reflexive policy of a nation of outsized optimism, drunk on the seemingly infinite resources available to it. Above all, he argued that the nation’s trustees needed to listen to the land itself — and respond accordingly.
Two days after Powell submitted his Arid Lands report to Schurz, the interior secretary forwarded it along to the House, which ordered 1,800 copies printed. That run was exhausted quickly, and another 5,000 copies printed afterward disappeared equally fast.
***
The Academy committee incorporated much of Powell’s report into its own, nevertheless watering it down considerably by passing over ethnology and his ideas about engineering the landscape. The committee recommended that the General Land Office’s surveyors general, along with the three current federal surveys of Hayden, Wheeler, and Powell, be subsumed under two civilian-run agencies in the Interior Department. All land-measurement operations would fall under the Coast and Interior Survey, while all investigations of geology and natural resources, together with land classification, would fall under a new consolidated geological survey. It also recommended that the president appoint a blue-ribbon commission to investigate public-land laws in order to create a new land-parceling system in the arid West, where traditional homesteading was both impractical and undesirable.
On November 6, 1878, the entire Academy approved the report with only one dissenting vote, that of Marsh’s bitter rival Cope. Powell focused next on the congressional backlash that the Academy’s report would surely elicit. After all, it cut out the War Department — and diminished the power of the General Land Office’s sixteen surveyors general and their contractors. And then, of course, Hayden remained capable of hijacking all of Powell’s work.
Powell launched a major lobbying effort, calling upon Newberry and Clarence King in late November to sway congressional opinion away from army management of the surveys. Ten days before the Academy presented its report to Congress on December 2, Powell decided not to seek the directorship of the new consolidated survey that Congress would most likely authorize. His deputy Clarence Dutton had already written a friend with news that his boss “renounces all claim or desire or effort to be the head of a united survey.” A close observer much later wrote that “no one episode illustrates more strongly the character of the man — to pass voluntarily to another the cup of his own filling when it was at his very lips.”
Noble sentiments may have in fact prompted Powell to step aside, but sheer fatigue with the political infighting could also have played a part. He had also grown shrewd in politics, anticipating full well that, as architect of the survey and land-office reform approach, he would feel the wrath of the vested interests. A general awareness that he was seeking to take the directorship might put the whole endeavor at risk. He now carried great ambitions for two mighty unfolding powers — the nation and science — but not comparable ambitions for his own wealth, power, or glory. When fame came, as it had with the descent of the Colorado, he would harness it to help overcome his next challenge, not leverage it into higher speaking fees, a larger house, or political office. His distaste for self-aggrandizement embodied the Wesleyan requirement of modesty. Work done was for God’s glory, not the individual’s. While Powell worshipped at a different altar, his work, not himself, remained the center of his life. But that did not mean he had stopped fighting to get someone installed to carry on the mission of science in good form.
In his eyes, Hayden had come to stand for the culture of Grant-era corruption after the war. Hayden’s often shoddy science, Powell believed, steered the interests of the United States in a damaging direction. Hayden’s ascent to the position of senior federal scientist would doom land-grant reform. With his willingness to play up to senators and his suspect optimism about the unlimited possibilities of the West, Hayden stood flatly in the way of Powell’s struggle to open minds as to what the West actually offered. In this contest, Powell felt that nothing less than democracy lay on the line.
When Congressman James Garfield asked Powell’s opinion of Hayden’s integrity as a scientist, the Major responded blisteringly that Hayden was “a charlatan who has bought his way to fame.” He was a “wretched geologist” who “rambled aimlessly over a region big enough for an empire,” shamelessly attempting to catch the attention of “the wonder-loving populace.”
Nor had Hayden stood idly by when Congress called upon the National Academy for an opinion: “I presume some great plan will be proposed that will obliterate the present order of things,” Hayden wrote a friend, “unless all our friends take hold and help.” In another letter Hayden told Joseph Hooker that “Hon. Abram Hewitt is an enemy of mine. . . . We had a hard time this last session and came near being decapitated. . . . We had to cultivate the good will of over 300 members to counteract the vicious influence of the [Appropriations] Committee.” Hayden had lobbied members of the Academy to keep John Strong Newberry off the committee. Clarence King, meanwhile, topped Powell’s list to run a consolidated survey.
King lived in New York, comfortable with seeking his own fortune and happily above the fray as Hayden, Wheeler, and Powell battled it out. He would do little to seek the directorship, but would be only too happy to accept it if offered. On the other side, Hayden launched a forceful letter-lobbying campaign. Unbeknownst to others, he had begun to suffer the effects of syphilis, very likely contracted from his frequenting of prostitutes. The disease, which would kill him nine years later, had already begun to cloud his judgment. His letter writing, however, appeared to be working. Again Powell countered with more lobbying of his own. In early January, Marsh received a letter from Clarence King, letting him know that King felt it was time to submit his credentials for the job.
Hayden still saw Powell as his major competitor until, in the middle of January, a friend notified him of Powell’s withdrawal; ten days later, Hayden wrote a friend that “all looks well now.” Of all the national surveyors, Hayden had published the most, had received more appropriations, and had more friends in Congress — and indeed had the bright feather of Yellowstone in his hat. The directorship was his to lose.
In late December, Powell had finished drafting the legislation that Schurz had requested to turn the Academy’s proposals into law. Powell cleverly tied three of the four proposals to appropriations bills, clearly intending to skirt the Public Lands Committee, crowded with western congressmen who would never allow such issues a hearing. Schurz forwarded them to John Atkins, the chair of the House Appropriations Committee, as well as to Abram Hewitt, the committee’s most influential member. Both strongly supported the measures. Atkins waited until February 10 to open congressional discussion, whereupon several weeks of vigorous debate ensued. Powell kept at work behind the scenes as a very public debate churned over the role of the federal government in the still largely undefined areas of science. He detailed his staff to bring Garfield books from the Library of Congress so he could cogently draft his position against proposed changes by General Humphreys and the Topographical Engineers.
The former Kansas shoe merchant, Representative Dudley C. Haskell, scoffed at federal dollars going to scientists collecting “bugs and fossils” and creating “bright and beautiful topographical maps that are to be used in the libraries of the rich.” Why would Congress reach into public coffers to pay these dubious scientists exorbitant sums to study the public lands? Other opponents of the Academy’s plan argued that the western public domain embraced much fine agricultural land. The West, the Montana newspaperman Martin Maginnis joyfully expounded, “contains in its rich valleys, in its endless rolling pastures, in its rugged mineral-seamed mountains, traversed by thousands of streams clear as crystal and cold as melting snow, all the elements of comfort, happiness, and prosperity to millions of men.” One congressman after another fumed at anyone so fainthearted as to criticize the extraordinary promise of the West. The “genius of our people,” wrote Representative John H. Baker of Indiana, was that they were “bold, independent, self-reliant, full of energy and intelligence,” who “do not need to rely on the arm of a paternal government to carve out their own fortunes or to develop the undiscovered wealth of the mountains.” Then he came to his real point: “I do not want them in their anxiety to perpetuate those or any other scientific surveys to interfere with our settlers upon the frontier.”
With Powell’s finger marks all over the Academy recommendations—much clearly pulled from his Arid Lands report—he now came under direct fire. Thomas Patterson, a former trial lawyer from Colorado, rose to decry Powell as a dangerous revolutionary, “this charlatan in science and intermeddler in affairs of which he has no proper conception.” Atkins’s proposal, he continued, was the work of one man, and threatened the West and its landed interests with disaster. Should Congress enlarge the land grants for grazing, then baronial estates would soon crowd the plains, an aristocratic few owning lands sufficient for a European principality and crowding out the small farmer upon which the nation depended. Powell must have been galled when the floor debate took this particular twist, especially when he had so consciously dedicated his efforts toward supporting the interests of the small farmer and preventing the aggregation of land and power that Patterson railed against. Patterson himself would go on to buy the Rocky Mountain News, making it a bullhorn for labor rights and the taming of corporate overreach. Indeed both men did not diverge much in their views. But at the heart of the matter lay a considerable foundational debate about who should be shaping the development of agricultural America and how much the government and scientific elite should be involved.
On February 18, 1879, Representative Horace Page of California offered a compromise that agreed to the consolidation of the scientific surveys but made no mention of reforming the land-survey system. Representative Haskell read a letter from a National Academy scientist, which submitted that the Academy debate was actually far more divisive than the one dissenting vote might indicate. The congressman would not reveal the letter’s author, most probably E. D. Cope; the missive was a ploy by Hayden’s people to sow doubt about the Academy’s recommendations.
Atkins amended Page’s compromise to include the creation of a commission to investigate the land-grant system. The measure passed 98 to 79. The approved Sundry Bill went to the Senate, where no discussion took place. In the Appropriations Committee, Hayden’s supporters weighed in strongly, the committee amending the bill so that the scientific surveys were consolidated under Hayden, even taking $20,000 from Powell to finish up his work and giving it to Hayden. The bill then passed to conference committee. When it emerged on March 3, the last day of the session, the Senate’s emendations placing Hayden in charge had been cut out, but so had the House reformers’ bid to place all the competing agencies under the Interior Department. The last-minute collection of appropriation bills to keep the government functioning passed and the 45th Congress closed.
Hayden may well have considered this outcome a victory, the Senate indicating its interest in his running the consolidated survey. All he needed now was to take the directorship. But he had not counted on Powell. The Major did not delay, writing at length to Atkins on March 4, pinning blame on Hayden for negatively influencing the tenor of the congressional discussion by raising false issues solely to advance himself personally. Powell then revealed his deepest concern: The appointment of Hayden would effectively end efforts to reform the system of land surveys. He asked Atkins to approach Schurz and President Hayes to obstruct Hayden’s bid and to sing the praises of King.
Two days later, Powell spoke with the president, Hayes questioning him in particular on Hayden’s methods of securing appropriations. Powell also wrote a lengthy letter to Garfield, furnishing him with a withering analysis of Hayden’s published work. He did not hold back, claiming that Hayden’s mind was utterly untrained and incoherent, leading him to fritter away federal money on work “intended purely for noise and show.” Powell also worked closely with O. C. Marsh, helping to coordinate the flow of letters in support of King. Marsh traveled to Washington and also met with the president.
Cope wrote Schurz in support of Hayden, claiming that “simply shameful” personal grudges had aroused the voices against his friend. As for King, Cope insinuated that his tenure in government service had been sullied by his taking fees from mining enterprises. But Cope’s letter could not stem the tide of questions raised against Hayden. King’s nomination was officially announced on March 20. “My blood was stirred,” wrote Hayden supporter and Brown University president Ezekiel G. Robinson, upon hearing the news. “There must have been some dexterous maneuvering to have brought about a change in the President’s mind.”
The Senate approved King’s nomination with the slightest opposition on April 3. Three days later Marsh wrote Powell, “Now that the battle is won we can go back to pure Science again,” then invited him and Gilbert to present papers to the upcoming National Academy annual meeting. When Powell told King he would be pleased to work for the new United States Geological Survey, King responded exuberantly. “I am more delighted than I can express. Hamlet with Hamlet left [out] is not to my taste. I am sure you will never regret your decision and for my part, it will be one of the greatest pleasures to forward your scientific work and to advance your personal interest.”
King did not last two years on the job.
Waiting in the wings would be John Wesley Powell, who would take over the directorship of the USGS, run it for 13 years, and fundamentally shape the role of science in the federal government.
***
From The Promise of the Grand Canyon by John F. Ross, published by Viking, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2018 by John F. Ross.
On Saving the Cuban Crocodile from American Invasion

At Hakai Magazine, Shanna Baker reports on the ongoing bid to preserve C. rhombifer, the Cuban crocodile species beloved of Fidel Castro, who was known to send living and embalmed specimens of the animal to allies around the world. The Cuban croc is endangered, not only due to shrinking habitat but also to hybridization, as its gene pool gets polluted by natural encounters with the bigger, shyer American crocodile.
Beside a spit of land jutting into a swampy enclosure, a female crocodile breaks the waterline, the bony ridges on her back jagged like an electrocardiogram. Her eyes track six sweat-soaked men standing in a haphazard semicircle, gripping poles twice their own height, as mosquitos orbit their straw hats. Another man works quickly with a hoe, leveling the dried grasses of her nest and chewing up the earth until he finds her unborn brood, laid just three days ago. The crocodile thrashes and lunges forward, but two men raise their weapons, ready to deliver a hard thump to the snout if she approaches.
She sinks back as the man in the middle of the mob loads her few dozen eggs plus a second set from a nearby nest into a plastic pail, cushioning them between layers of dirt. At the top, he places four last eggs—the rejects—each the size of a small mango. They feel like unpolished marble and all bear a sizable dent. The tiny would-be Cuban crocodiles (Crocodylus rhombifer) inside are goners—the membranes are too damaged—but the others are destined for an incubation room, where air conditioners humming round the clock will hopefully hold them at a steady temperature. If all goes as planned, in 75 days or so, hatchlings will emerge and help move the needle on C. rhombifer’s prospects for survival.
Conserving the Cuban croc was one of Fidel Castro’s first priorities after he steamed into power in 1959. Just months into his rule, he ordered the creation of the Criadero de cocodrilos, Ciénaga de Zapata — or Zapata Swamp Captive Breeding Facility — a cluster of ponds, rows of concrete-block pens, and a couple of narrow one-story buildings split into modest offices and workspaces for staff, two and a half hours south of Havana. Castro always had a predilection for wild spaces and things, says environmental historian Reinaldo Funes-Monzote of the University of Havana. Whether he cherished endemic species because they fit with his hypernationalistic sensibilities, or related to their untamed energy, or was just enlightened to the inherent value of wildlife is anyone’s guess, though crocodiles must have become a point of pride for him at some stage — he eventually developed a habit of gifting them, either living or embalmed, to foreign allies.
The Cuban is bolder and hunts during the day. It has a stubby snout, a reputation for jumping, and a tendency to walk with its belly high off the ground. The American is bigger, more apt to hide, searches for prey at night, sports dark bands on its back and sides, and has a long, pointed snout and extra webbing on its hind toes. The differences are as distinct as red from blue.
Seeking a Roadmap for the New American Middle Class

Livia Gershon | Longreads | March 2018 | 8 minutes (1,950 words)
Over the past few months, Starbucks, CVS, and Walmart announced higher wages and a range of other benefits like paid parental leave and stock options. Despite what the brands say in their press releases, the changes probably had little to do with the Republican corporate tax cuts, but they do reflect broader economic prosperity, complete with a tightening labor market. In the past couple of years, real wages hit their highest levels ever, and even the lowest-paid workers started getting raises. As Matt Yglesias wrote at Vox, “for the first time in a long time, the underlying labor market is really healthy.”
But it doesn’t feel that way, does it? From the new college graduate facing an unstable contract job and mounds of debt to the 30-year-old in Detroit picking up an extra shift delivering pizzas this weekend, it just seems like we’re missing something we used to have.
In a 2016 Conference Board survey, only 50.8 percent of U.S. workers said they were satisfied with their jobs, compared with 61 percent in 1987, when the survey was first done. In fact, job satisfaction hasn’t come close to that first reading in this century. We’re also more anxious and depressed today than we’ve been since the depths of the recession, and we’re dying younger — particularly if we’re poor.
So maybe this is a good moment to stop and think about what really good economic news would look like for American workers. Imagine for a moment that everything goes right. The long, slow recovery from the Great Recession continues, rather than reversing itself and plunging us back into high unemployment. Increased automation doesn’t displace a million truck drivers but creates new, more skilled driving jobs. The retirement of the Baby Boomers reduces labor supply, driving up wages at nursing homes, call centers, and the rest of the gigantic portion of the economy where pay is low.
Would this restore dignity to work and a sense of optimism to the nation? Would it bring back the kind of pride we associate with the 1950s GM line worker?
I don’t think it would. I think it would take far more fundamental changes to win justice for American workers. But I also think it’s possible to strive for something way better than the postwar era we often remember as a Golden Age for workers.
Let’s start by dispelling the idea that postwar advances for American workers were some kind of natural inevitability that could never be replicated today. Yes, in the 1940s, the United States was in a commanding position of economic dominance over potential rivals decimated by war. And yes, companies were able to translate the manufacturing capacity and technological know-how built up through the military into astounding new bounty for consumers. But, when it comes to profitability, business has also had plenty of boom times in recent decades, with no parallel advances for workers.
Let’s also set aside the nostalgia about how we used to make shit in this country. Page through Working, Studs Terkel’s classic 1972 book of interviews with a broad range of workers, and factories come across as a kind of hellscape. A spot welder at a Ford plant in Chicago describes standing in one place all day, with constant noise too loud to yell over, suffering frequent burns and blood poisoning from a broken drill, at risk of being fired if he leaves the line to use the bathroom. “Repetition is such that, if you were to think about the job itself, you’d slowly go out of your mind,” he told Terkel.
The stable, routine corporate office work that also thrived in the postwar era certainly wasn’t as unpleasant as that, but there’s a whole world of cultural figures, from Willy Loman to Michael Scott, that suggests it was never an inherent font of meaning.
The fact that the Golden Age brought greater wealth, pride, and status to American workers, both blue- and white-collar, had little to do with the booming economy or the nature of the work. It was a result of power politics and deliberate decisions. In the 1930s and ‘40s, unionized workers, having spent decades battling for power on the job, at severe risk to life and livelihood, were a powerful force. And CEOs of massive corporations like General Motors were scared enough of radical workers, and hopeful enough about the prospects of shared prosperity, to strike some deals.
A consensus about how jobs ought to work emerged from these years. Employers would provide decent pay, health insurance, and pensions for large swaths of the country’s workers. The federal government would build a legal framework to address labor disputes and keep corporate monopolies from getting out of control. Politicians from both parties would march in the Labor Day parade every year, and workers would get their fair share of the new American prosperity.
Today, of course, the postwar consensus has broken down. Even if average workers are making more money than we used to, the gap between average and super-rich makes us feel like we’re getting nowhere. We may be able to afford iPhones and big-screen TVs, but we’ve got minimal chances of getting our kids into the elite colleges that define the narrow road to success.
And elite shows of respect for workers ring more and more hollow. Unions, having drastically declined in membership, no longer have a seat at some of the tables they used to. Politicians celebrate businesses’ creation of jobs, not workers’ accomplishment of necessary and useful labor. A lot of today’s masters of industry clearly believe that workers are an afterthought, since robots will soon be able to do anyone’s job except theirs.
But let’s not get too nostalgic about the Golden Age. As many readers who are not white men may be shouting at me by this point, there was another side to these mid-century ideas about work. The entire ideological framework defining a job with dignity was inextricably tied up with race and gender.
From the start of the industrial revolution, employers used racism to divide workers. And union calls for respect and higher wages were often inseparable from demands that companies hire only white men. The Golden Age didn’t just provide white, male workers with higher wages than everyone else but also what W.E.B. Du Bois called the “public and psychological wage” of a sense of racial superiority.
Just as importantly, white men in the boom years also won stay-at-home wives. With rising male wages, many white women — and a much smaller number of women of other races — could now focus all their energy on caring for home and family. For the women, that meant escape from working at a mill or cooking meals and doing laundry for strangers. But it also meant greater economic dependence on their husbands. For the men, it was another boost to their living standard and status.
Golden Age corporate policies, union priorities, and laws didn’t create the ideal of the white, breadwinner-headed family, but they did reinforce it. Social Security offered benefits to workers and their dependents rather than to all citizens, and excluded agricultural and domestic workers, who were disproportionately black. The GI Bill helped black men far less than white ones and left out most women except to the extent that their husbands’ benefits trickled down to them.
Today, aside from growing income inequality, unstable jobs, and the ever-skyward climb of housing and education costs, part of the pain white, male workers are feeling is the loss of their unquestioned sense of superiority.
So, can we imagine a future Golden Age? Is there a way to make working for Starbucks fulfill all of us the way we remember line work at GM fulfilling white men? Maybe. With an incredible force of political will, it might be possible to rejigger the economy so that modern jobs keep getting better. It would start with attacking income inequality head-on. The government could bust up monopolistic tech giants, encourage profit-sharing, and maybe even take a step toward redistributing inherited wealth. We’d also need massive social change to guarantee people of color and women equal access to the good new jobs, and men and white people would need to learn to live with the loss of the particular psychological wages of masculinity and whiteness.
But even all that would still fail to address one thing that made work in the Golden Age fulfilling for men: the wives. Stay-at-home moms of the mid-twentieth century weren’t just a handy status symbol for their men. They were household managers and caregivers, shouldering the vast majority of child-raising labor and creating a space where male workers could rest and be served. And supporting a family was a key ingredient that made otherwise draining, demeaning jobs into a source of meaning.
Few men or women see a return to that ideal as a good idea today. But try imagining what good, full-time work for everyone looks like without it. Feminist scholar Nancy Fraser describes that vision as the Universal Breadwinner model — well-paid jobs, with all the pride and status that come with them, for all men and women. She notes that it would take massive spending to outsource childcare and other traditionally unpaid “female” work — particularly since those jobs would need to be good jobs too. It would also leave out people with personal responsibilities that they couldn’t, or wouldn’t, hand over to strangers, as well as many with serious disabilities. And it certainly wouldn’t solve the problem many mothers and fathers report today of having too little time to spend with family.
A really universal solution to the problem of bad jobs would have to go beyond “good jobs” in the Golden Age model. It would be a world where we can take pride in our well-paid jobs at Starbucks without making them the center of our identities. That could mean many more part-time jobs with flexible hours, good pay, and room for advancement. It could mean decoupling benefits like health care and retirement earnings from employment and providing a hefty child allowance. Certainly, it would mean a social and psychological transformation that lets both men and women see caring work, and other things outside paid employment, as every bit as valuable and meaningful as a job.
As a bonus, this kind of solution would also make sense when we do fall back into recession, or if the robots do finally come for a big chunk of our jobs.
All this might sound absurdly utopian. We are, after all, living in a world where celebrity business leaders claim to work 80-plus hour weeks while politicians enthusiastically deny health care to people who can’t work.
But the postwar economy didn’t happen on its own. It was the product of a brutal, decades-long fight led by workers with an inspiring, flawed vision. And today, despite everything, new possibilities are emerging. Single-payer health care is a popular idea, and “socialism” has rapidly swung from a slur to a legitimate part of the political spectrum. Self-help books like The 4-Hour Workweek — which posit the possibility of a radically different work-life balance, albeit based on individual moxie rather than social change — have become a popular genre. Young, black organizers in cities across the country are developing their own cooperative economic models. And if there’s any positive lesson we can take from the current political moment, it’s that you never know what could happen in America. Maybe a new Golden Age is possible. It’s at least worth taking some time to think about how we would want it to look.
***
Livia Gershon is a freelance journalist based in New Hampshire. She has written for the Guardian, the Boston Globe, HuffPost, Aeon, and other places.
How Lobbyists Normalized the Use of Chemical Weapons on American Civilians
Anna Feigenbaum | An Excerpt from: Tear Gas: From the Battlefields of WWI to the Streets of Today | Verso | November 2017 | 22 minutes (6,015 words)
***
With his thick moustache and piercing, deep-set eyes, General Amos Fries’s passion shone through as he spoke. In a 1921 lecture to military officers at the General Staff College in Washington, DC, Fries lauded the Chemical Warfare Service for its wartime achievements: the US had entered the chemical arms race “with no precedents, no materials, no literature and no personnel.” The 1920s became a golden age of tear gas. Fries capitalized on the US military’s enthusiastic development of chemical weapons during the war, turning these wartime technologies into everyday policing tools. As part of this task, Fries developed an impressive PR campaign that turned tear gas from a toxic weapon into a “harmless” tool for repressing dissent.
Manufacturers maneuvered around the Geneva Protocol, exploiting international loopholes with ease. But these frontier pursuits could not last forever. The nascent tear gas industry would come to face its biggest challenge yet, in the unlikely form of US senators. In the 1930s, two separate Senate subcommittees were tasked with investigating the dodgy sales practices of industrial munitions companies and their unlawful suppression of protest.
General Fries’s deep personal commitment to saving the Chemical Warfare Service won him both allies and critics, often in the same breath. Already known for his staunch anticommunism and disdain for foreigners of all kinds, Fries was an unapologetic proponent of military solutions for dissent both at home and abroad. A journalist for the Evening Independent wrote that Fries was often “accused of being an absolute militarist anxious to develop a military caste in the United States.” But to those who shared his cause, Fries was an excellent figurehead for Chemical Warfare. A family man, a dedicated soldier, and a talented engineer, Fries was the perfect face of a more modern kind of warfare.
Just as some in Europe argued that chemical weapons were a mark of a civilized society, for General Fries war gases were the ultimate American technology. They were a sign of the troops’ perseverance in World War I and an emblem of industrial modernity, showcasing the intersection of science and war. In an Armistice Day radio speech broadcast in 1924, Fries said, “The extent to which chemistry is used can almost be said today to be a barometer of the civilization of a country.” This was positioned as a direct intervention against the international proposal for a ban on chemical weapons, as preparations for the Geneva Protocol were well under way. If chemical weapons were banned, Fries knew it would likely mean the end of the CWS—and with it his blossoming postwar career.