
Happy, Healthy Economy

Francesca Russell / Getty

Livia Gershon | Longreads | August 2018 | 8 minutes (2,015 words)

In 1869, a neurologist named George Beard identified a disease he named neurasthenia, understood as the result of fast-paced excess in growing industrial cities. William James, one of the many patients diagnosed, called it “Americanitis.” According to David Schuster, the author of Neurasthenic Nation (2011), symptoms were physical (headaches, muscle pain, impotence) and psychological (anxiety, depression, irritability, “lack of ambition”). Julie Beck, writing for The Atlantic, observed that, among sufferers, “widespread depletion of nervous energy was thought to be a side effect of progress.”

Recently, there have been a number of disconcerting reports that one might view as new signs of Americanitis. A study by the Centers for Disease Control found that, between 1999 and 2016, the suicide rate increased in nearly every state. Another, from researchers at the University of Michigan, discovered that, over the same period, excessive drinking, particularly among people between the ages of 25 and 34, correlated with a sharp rise in deaths from liver disease. A third, by University of Pittsburgh researchers, suggested that deaths from opioid overdoses, recognized for years as an epidemic, were probably undercounted by 70,000.

Read more…

The Sexist Trials of Female Attorneys

CSA Archives / Getty

The climate of a courtroom puts female attorneys on trial for their gender instead of allowing them to do their job. Lawyer and professor Lara Bazelon discusses in The Atlantic the pernicious discrimination female lawyers face in the courtroom, including constant degradation and impossibly constricting ‘rules.’

Sexism infects every kind of courtroom encounter, from pretrial motions to closing arguments—a glum ubiquity that makes clear how difficult it will be to eradicate gender bias not just from the practice of law, but from society as a whole.

I hated pantyhose, both the cringe-inducing word and the suffocating reality. They itched miserably and ripped. But showing up in federal court with bare legs was as unthinkable as showing up drunk.

I was practicing law differently from many of my male colleagues and adversaries. They could resort to a bare-knuckle style. Most of what I did in the courtroom looked more like fencing. Reading over my old trial transcripts, I am taken aback by how many times I said “Thank you”—to the judge, to opposing counsel, to hostile witnesses. And by how many times I apologized.

Embracing traits traditionally associated with women seems to pay off particularly well in litigation involving so-called women’s issues. In many of these cases, female trial lawyers are favored and even actively recruited. In the civil arena, for example, women have thrived in high-stakes medical-malpractice lawsuits where the plaintiff claims that the defendant’s product injured her genitalia or reproductive organs.

Read the story

An Igbo Slaver’s Descendants Reckon With History

A replica 18th-century square rigger, The Zong, is escorted by HMS Northumberland on the River Thames on March 29, 2007, during events marking the 200th anniversary of the abolition of the slave trade. The Zong was at the centre of a court case in 1783, after 133 slaves were thrown overboard in an insurance scam; the resulting public outrage led to the rise of the abolitionist movement. (Photo by Peter Macdiarmid/Getty Images)

For the New Yorker, author Adaobi Tricia Nwaubani dives into the history of her great-grandfather, an Igbo slave trader and palm merchant who participated in the transatlantic slave trade. “African intellectuals tend to blame the West for the slave trade, but I knew that white traders couldn’t have loaded their ships without help from Africans like my great-grandfather,” Nwaubani writes. She reveals a complex caste system that pre-dated European influence, and shows how current generations of her family are accounting for their ancestor’s relationship to the suffering of many.

On the first day of the fast, members of my family met in small groups in London, Atlanta, and Johannesburg. Some talked on the phone, and others chatted on social media. Thirty members gathered under a canopy in my parents’ yard. With tears in his eyes, my father explained that, in Nwaubani Ogogo’s day, selling and sacrificing human beings was common practice, but that now we know it to be deeply offensive to God. He thanked God for the honor and prestige bestowed on our family through my great-grandfather, and asked God’s forgiveness for the atrocities he committed. We prayed over a passage that my father texted us from the Book of Psalms:

Who can understand his errors?
Cleanse me from secret faults.
Keep back Your servant also from presumptuous sins;
Let them not have dominion over me.
Then I shall be blameless,
And I shall be innocent of great transgression.

During the ceremony, I was overwhelmed with relief. My family was finally taking a step beyond whispering and worrying. Of course, nothing can undo the harm that Nwaubani Ogogo caused. And the ohu, who are not his direct descendants, were not invited to the ceremony; their mistreatment in the region continues. Still, it felt important for my family to publicly denounce its role in the slave trade. “Our family is taking responsibility,” my cousin Chidi, who joined from London, told me. Chioma, who took part in Atlanta, said, “We were trying to make peace and atone for what our ancestors did.”

On the final day, my relatives strolled along a recently tarred stretch of road to our local Anglican church. The church was established in 1904, on land that Nwaubani Ogogo donated. Inside, a priest presided over a two-hour prayer session. At the end, he pronounced blessings on us, and proclaimed a new beginning for the Nwaubani family. After the ceremony, my family members discussed making it a yearly ritual. “This sort of thing opens up the mercy of God,” my mother, Patricia, said. “People did all these evil things but they don’t talk about it. The more people confess and renounce their evil past, the more cleansing will come to the land.”

Read the story


Redlining in the Lap Lane

"Pools have historically been the sites of major feuds over race, income, and access." Olga Khazan for CityLab (Photo by Black 100/Getty Images)

In CityLab, Olga Khazan revisits her hometown of McKinney, Texas, to ask residents how they’ve been faring since a 2015 viral video captured Eric Casebolt, a white police officer, using excessive force on Dajerria Becton, a black teenager, at an unauthorized pool party.

Khazan soon finds that tensions in the community are still running high three years later, and that the fallout tracks with how private club pools and homeowners’ associations have historically provided a cover for redlining.

The west has long been referred to as the “new” side, the “good” side, and sometimes the “white” side.

Builders have carved up the west side into sylvan subdivisions with names like Hidden Creek and Eldorado Lakes. The west-side neighborhoods are full of tidy lawns and brick homes. To combat the triple-digit heat that engulfs North Texas for much of the summer, they have swimming pools that are accessible only to residents.

On the east side, some homes are new or remodeled, but others are patched with plywood and corrugated metal. Eighty-six percent of the west side was white in 2009, when the city was forced to settle an affordable-housing lawsuit, compared with 49 percent of the east. The lawsuit claimed that all of the town’s public housing and most of the landlords willing to take Section 8 vouchers were on the east side.

The incident was perhaps especially incendiary because it involved a swimming pool: Pools have historically been the sites of major feuds over race, income, and access. As my colleague Yoni Appelbaum wrote in the wake of the McKinney incident, in the early 20th century, public pools were plentiful—but segregated. As civil-rights activists pushed to desegregate them, many cities privatized the facilities rather than be forced to integrate them. Private and exclusive pools became more common; public ones, less so. “Suburbanites organized private club pools rather than fund public pools because club pools enabled them to control the class and racial composition of swimmers, whereas public pools did not,” the historian Jeff Wiltse noted in his 2007 book, Contested Waters: A Social History of Swimming Pools in America.

Many of the homes on McKinney’s east side were built before homeowners’ associations began incorporating gated pools into their developments. People in McKinney who don’t belong to homeowners’ associations can use the city’s public swimming pools. There are four, and rather than operating on homeowners’-association dues, they charge a fee for admission. The newest pool, at a facility called the Apex Center, features water slides and costs $10 a person for a day pass. (It’s on the west side.) If Rhodes had wanted to host her party legally, she would have had to rent one of these pools. For up to 200 guests, the cost is $110 to $800 for two hours, depending on the pool.

“Craig Ranch is a multimillion-dollar development,” said Henry Moore, a pastor at Saint Mark Baptist Church, an old black church on the east side, whom I spoke with one Sunday last month before services began. “On the east side, there is no Craig Ranch multimillion-dollar development. So there will be nicer things on the west side than there are on the east side.”

When the socioeconomic divide in a town is so stark, the line between feeling unwanted because you’re not from the neighborhood and feeling unwanted because of your race can start to blur. “Are you saying I’m not supposed to be here because I don’t live here?” Moore continued, speculating on the mind-set of some of the teens that day. “But I was invited.”

Read the story

Smooth Spaces, Fuzzy Lives

Brian Lawless/PA Wire

Rachel Andrews | Brick | Summer 2018 | 18 minutes (4,831 words)

A photograph in an Irish newspaper depicts a member of the Garda Síochána shaking hands with his counterpart from the Police Service of Northern Ireland at one of the points where the territory of the Republic turns into that of Northern Ireland. The photograph, published in November 2015, seven months before Britain voted to exit the European Union, accompanies an article on plans for a “border corridor,” whereby police on both sides of the border can pursue fleeing criminals into each other’s region.

There’s a kind of joviality to the photograph: firm clasping of hands, big smiles. Behind the two men is the Irish landscape, rolling, misted, a river cutting through fields of green. The officers wear different uniforms, but the only obvious territorial demarcation, the only hint that they inhabit different countries, with different laws, health systems, and currencies, is a sharp change in road color, from black to sudden grey.

I remember this non-distinctiveness, the dawning awareness that I had crossed a boundary, from the many trips I took to Northern Ireland between 2007 and 2010, when I worked on an essay that documented the systematic demolition of the Maze prison, a story that presented itself symbolically and — as it turned out — all too simplistically as one of a settling of the past and a coming together for the future.

I never went North as a child. I remember a drawing in a newspaper depicting a map of Ireland. In the sliver of space that is Northern Ireland, the cartoonist had penned: “there be dragons.” In truth, it was worse than that. Ask me as a 6-year-old, a 12-year-old, about Northern Ireland and I would have responded: bombs and blood. Ask my young daughter today and she might look at you blankly. It means nothing to her, and that is a good thing.

There were ways it meant nothing to us too. I grew up in Cork, in the very south of Ireland, and that meant growing up a world away from bombs and blood. As children in the 1970s and 1980s, we were safe from soldiers in the back gardens, from streets we couldn’t walk down. But things filtered into our child worlds. From television: the dark loom of the watchtower, the helicopters, the aerial prison shots following the 1983 Maze escape; Gordon Wilson, who lost his 20-year-old daughter in the 1987 Enniskillen bombing: “I shall pray for those people tonight and every night.” Of the few discussions in school, I remember one: the classmate who had relatives in Belfast, and her upset, her anger, at our fear, our distancing and distaste.

As I got older and traveled in Europe, the easy comfort of that distancing — you and I are not alike — was undercut. “So where did you hide the bomb?” a French colleague joked when I worked for a summer at a hotel in Munich. “Until I met you, I thought all Irish people were savages,” a German girl told me during my Erasmus year in France. This was the early to mid-90s and everywhere I went, there it was. “We in Australia just can’t understand it,” said the visitor to my apartment in London. I still remember the insult of his bemusement and sincerity, as well as my own avoidance. As far as everyone on the outside was concerned, I was them and they were me. I knew better — I mean, was it not obvious?

The first time I went North was in 2000, two years after the Good Friday Agreement, after the Omagh bombing and the howl of anguish that went with it, after it became imaginable, almost normal, for me to drive in my tiny Southern-registered Fiat from Dublin to Belfast and back again as I researched a writing project on women working in politics in Northern Ireland. I was in my late 20s by then, and I wasn’t afraid in Belfast’s city center, which had the same familiar department store names as any British or Irish main street; nor on the Falls Road, which wrapped me in a warm blanket of tri-colors and Celtic symbols; but I felt heavy and intimidated as I made my way up the red, white, and blue pavements on Shankill Road to the offices of the Progressive Unionist Party, hesitant to speak in the corner store lest I betray myself through the soft spill of my Southern tones. But then this dissolved too, and seven years later, when I spent time interviewing former prison officers at the Maze, as well as the residents who had grown up beside the prison, all from a Unionist background, the sing of my Cork accent felt more like a benign curiosity than anything traitorous or threatening.

“Merging is dangerous,” writes Rebecca Solnit, “at least to the boundaries and definition of the self.” Is that why we wrestle against it so? The border with Northern Ireland, once a site of blocked roads and lookout towers, has evaporated, at least on the surface of things, but it remains a place of struggle, of contest, a tussle between those who wish to take it one way and those who would move in another direction, either within the boundaries of a unified Ireland or into the space of clarity that tells us where we end and they begin. The amorphous situation that has existed along the border for nearly twenty years, a fudge that has resisted discrete categorizations and that we seemed to have found a way to live with, or live within, is under pressure in the wake of the Brexit vote, as the clamor to once again define what we are and what we are not, begins to accelerate. We look for the solace of certainty, of knowing if we are one thing or the other, rather than allow ourselves to remain within the complicated, messy space of the both/and, a state made possible by the exhaustion left after thirty years of violence.

Hannah Arendt had a particular view of merging. As she searched out a meaningful concept of a Jew’s place in the world following the sundering caused by the Second World War, she ultimately rejected a form of Zionism that connected citizenship to ethnicity and tethered both to the boundaries of the nation-state. On the other hand, she wrote scathingly of those European Jews who would assimilate, who would ape the Gentiles in an effort to find their way into the ranks of the human, who would, she wrote in disgust, become “good Frenchmen in France,” “good Germans in Germany.” Arendt, you could say, had been one such good German. As a child she did not know that she or her family were Jewish; she learned of her ethnicity only through the anti-Semitic taunts of children on the street. But it was the shocking stripping of her German citizenship as an adult in the 1930s that ultimately woke her to the helpless vulnerability of the assimilated Jew and formed her conviction that Jews must stand defiantly aloof from the boundaries of nationality, turning instead toward the belonging of the citizen; the belonging that attaches to full and complete membership of a political community; the belonging that confers the right to meaningfully speak, act, and be heard in such a community; the belonging that means inhabiting a territory without subscribing to an overarching identity narrative. “Refugees driven from country to country represent the vanguard of their peoples — if they keep their identity,” she wrote. Today her sentiments do not appear so different from those of Dina Nayeri, an Iranian refugee who received U.S. citizenship at 15 and became a French national at 30, and who wrote in the Guardian that she had lost interest in the need to rub out her face as tribute for these benefactions. “As refugees, we owed them our previous identity. We had to lay it at their door like an offering, and gleefully deny it to earn our place in this new country. There would be no straddling. No third culture here,” she said, although a third culture appears to be the choice made by Arendt’s beloved Heinrich Heine, at least as she described it, which was to live as both a German and a Jew rather than deny his Jewishness as the price of belonging. “He simply ignored the condition which had characterized emancipation everywhere in Europe — namely, that the Jew might only become a man when he ceased to be a Jew,” she wrote.

Arendt came out of a Europe that had, she witnessed, conclusively intertwined national rights with human rights, which left her as mistrustful of a national, bordered identity as she was of the “abstraction” of any solemn notions of the inalienable rights of man. Heine, the Prussian-born poet and literary critic, came of age in the early nineteenth century, an era of political instability and contentiousness in his homeland; his conversion to Lutheranism was reluctant, regretted, and carried out only as the price of “admission into European culture.” In the early years of the twenty-first century, there was a feeling —I had the feeling — that Europe, at least on some parts of its continent, had found its way beyond these aspects of its shattering history and was on the turn toward the global and the flexible. In 2002, when I lived for a time in Paris, I could board a plane in France and emerge in Italy, where I could retrieve my bags and leave the airport without showing any identification, without queues or questions. This identity-less travel, the result of the then seven-year-old Schengen Agreement and so opposite to my conditioned, normalized experience of waiting in dutiful lines, gave me the very real sense of being a human in the world. The continent of Europe — the part of it that now had a common currency and permeable frontiers, and even onwards toward the rapidly opening East — felt magical, enlightened even, as if we were all in this together. The distinctions between us, forged through cultural, religious, and geographic experience, appeared shapeless now. I could be both Irish and European; I felt that I could, as Arendt wrote, “speak the language of a free man and sing the songs of a natural one.”

But there was a “them.” From my window in Stalingrad, the quartier in the north of Paris where I lived from September to Christmas, I watched men in jeans and jackets congregate outside in the early darkness of the winter evenings, lining up in huddled rows on a Friday for weekly prayers. I looked on, curious — what are they doing? — before I understood. This was one year before the Iraq War, which fractured the Arab world, but already and for long years it was not easy to be Muslim in France, even if you were the French-born descendant of those who had come in the 1950s and 1960s as part of the first wave of migrant workers from northern Africa who stayed in search of a better life; even if you identified as both Muslim and French, as really all, or at least so many, of such descendants do, and as French civic society, with its emphasis on the primacy of the citoyen, encourages — in theory anyway. The exclus, they used to call them. The excluded. If I lived in Stalingrad today, the men across the street would no longer be there; in 2011, politicians banned the saying of street prayers in Paris following far-right protests about creeping Islamization. Instead, near the street I lived on, under the bridge where the metro station lay, there would almost certainly be tents and other makeshift shelters constructed by refugees from Iraq, Syria, Afghanistan, and elsewhere, part of a new and different wave of migration that, along with the 2008 economic crisis, has upended all of Europe. In 2002, I also went to Greece on a reporting assignment. There was no graffiti then comparing Angela Merkel to Hitler; today many in the desperately-indebted country view the dominance of German capital as the source of their woes. In Italy, France, Germany, a radicalized electorate now supports nationalist parties, looks at the European Union with deep suspicion. We were never all in this together.

For the moment, I can travel from Ireland to Britain without a passport. For the moment, I can drive from Dublin to Belfast without stopping, as the road melts from the N1 to the A1 and the white and black sign informs me that speed limits are now being monitored in miles rather than kilometers. (How different to John McGahern’s experience in the early 1990s, recounted in the essay “County Leitrim: The Sky Above Us”: “There are ramps and screens and barriers and a tall camouflaged lookout tower,” he said of the border crossing at Swanlinbar in County Cavan. “A line of cars waits behind a red light. A quick change to green signals each single car forward. In the middle of this maze armed soldiers call out the motor vehicle number and driver’s identification to a hidden computer. Only when they are cleared will the cars be waved through. Suspect vehicles are searched. The soldiers are young and courteous and look very professional.”) By the time I will have finished writing this article, British Prime Minister Theresa May will have triggered Article 50, and the movements I have become used to taking between cities and countries will have been thrown into confusion. Since the terrorist attacks of November 2015, France has been in a state of emergency that includes a firm policing of its borders. For more than a year and a half, commuters travelling from Malmö in Sweden to Copenhagen in Denmark had to present their IDs. Temporary border controls have also been introduced by Germany, Austria, and Norway. Merging is dangerous. Those hoping for a united Ireland — and I am surely one — forget this. On his blog, the journalist and Northern analyst Andy Pollak notes that Andrew Crawford, the former special adviser to current Democratic Unionist Party (DUP) leader Arlene Foster, used to go through reports from one North-South body removing the phrase “all-Ireland.” Perhaps the action of deletion helped Crawford forge certainty, was part of an attempt to make sense of how he existed in time and space. Forging certainty helps us all as we construct both story and identity in order to figure out how to live, but certainty, or at least a fixed destination, gets us into trouble too: we blind ourselves to possibilities, to the creative potential that lies outside of the either/or, to what can happen when we follow Arendt, say, or Deleuze, that great demolisher of dualisms, into the space of the non-being, the uncertain, the becoming.

In her photographic series Kinderwunsch, Ana Casas Broda depicts her body in thrall to those of her children, an artist willing to lose herself in conversation with flux, with change, with overwhelm. The photos are intimate and direct. Casas Broda often stares unsmilingly at the camera: a candid, life-worn Olympia, her pregnant body naked and big, uncomfortable-looking with her second child, or scarred and slack following fertility treatment and birth. In one of the images, her children have marked her face and torso with crayon; she both encouraged this and passively accepted the results. “I am their canvas: they play with me and change me,” she said in an interview. Kinderwunsch means “desire to have children,” and Casas Broda submits, it appears to me, to the terror and the unknown of that primal desire. She tumbles downwards, inwards. In the photographs, her children clothe her in tissue paper, they cover her in Play-Doh. “I see their scribbles on my body as a symbol of how motherhood has changed me,” she said. What she is really depicting is dissolution (of a former self), symbiosis — and something else. In some of the images, she and her children appear as one, interwoven, but there are others where she is alone, or they are indifferent to her: a son plays a video game as she lies naked on a couch, in between mother and person, neither here nor there, her body nonetheless relaxed, strangely at ease in the moment.

Around the time I began my Maze project, I was experiencing the greatest disintegration of self I had ever felt. Crossing the border from North to South represented moments of enormous exhilaration and giddy freedom: dazed as I was, when I lay in a border hotel without the baby, who had just turned seven months, I thought that I could see a way back to myself, that the place where I ended and the child began, would somehow become obvious again, clearly defined. I was wrong about that: there was no going backward. There was no going forward either, at least not in the way I wanted or imagined. Since the birth of my daughter, I remain in limbo land, the borders of a self so carefully constructed over nearly four decades now shifting. She arrived and I disappeared, something like that anyway. The categories I had thought surrounded me have dissipated into confusion and nothingness, and that, if I think about it too much, can be terrifying. Did I turn into you, I used to ask her when she was a baby, or have you become me?

When Colm Tóibín walked the border between North and South in 1987, he bumped into questioning British soldiers; a blown-up bridge on a road that once led from Dublin to Enniskillen; and, in Derry, a march led by former DUP leader Peter Robinson, then in the ascendant. Tóibín feared opening his mouth during the march, lest the crowd (young men in the main, some drunk) spot him as a Southerner. Despite the disappearance of the island’s physical frontier, the hangover from these tensions remains. My friend, a middle-class Northerner from a Catholic background who has lived in Dublin for nearly twenty years, at times employs turns of phrase that leave me reaching for a Cockney rhyming slang dictionary. Yet she and I both use colloquialisms a person born in England will never have heard. Nonetheless, for a long time my friend was lost and lonely in Dublin, reluctant to move back to a society still undercut by a deep lack of trust but without solid ground in the cultural space of the South; she felt different. “I was different,” she told me, as I tried to grasp her feelings of statelessness. It’s not as if we are from different countries, I told myself, not really anyway. But the thing is, we are, both literally and metaphorically. The border has dissolved, but trauma, so deep, so wounding, cuts us off from one another, makes strangers of us in the same land, pulls me one way, pushes her another; trauma turns a society inward, and it has turned Northern Ireland, in the words of retired Oxford professor of Irish history Roy Foster, into more of its own little place than ever. What we have in common is this (and this is easy to write and hard to live): we are more the same than we are different.

The artist Rita Duffy grew up Catholic in a largely Protestant area of Belfast; she is the progeny of a Southern mother and a father whose own father, a Catholic from the Falls Road, died at the Somme. Her two great-uncles on her mother’s side supported and may have been actively involved in the 1916 rebellion, which ultimately led to Irish independence, a civil war, and the fracturing of the island of Ireland. “I was continually fluctuating between nationalities, between identities, curious to know could I somehow land up in the middle somewhere that satisfied me today,” she told a symposium I attended in 2016. In recent years, Duffy has established herself within the space of the liminal: “I crept out to the edges of Ulster and we bought a little piece of the border. We built a house and I now have a studio just a mile and a half on the Southern side, and I live a mile and a half on the Northern side, so I kind of live in neither-here-nor-there land, which is a really interesting place to be as an artist. It’s very confusing and out of that springs the best imaginative possibilities for me.” Out of those imaginative possibilities have arrived big, bold ideas. The Titanic passenger liner was built in Belfast; the tragedy of its unfulfilled promise can be viewed as a metaphor for the long years the North lost to violence. In 2005, Duffy founded Thaw, a company set up to fund the towing of an iceberg from the Arctic to Belfast, where it would be moored outside the city and allowed to melt, in the process encouraging the shrinking of the deep, frozen divisions that still exist within Northern Irish society. Duffy has not yet found the means to drag her iceberg to Belfast, but since 2003 her paintings have been replete with the mythology of those hulking, frozen structures. She has created figurative images that appear trapped, encased in ice: Father Edward Daly, crouching, waving his handkerchief; a close-up of an arm, gesturing, holding a white handkerchief that may itself be an iceberg in miniature; in another painting, there is a Pieta, a mother holding a dying son, both emerging out of the bulk of an ice structure.

Duffy paints these images in greys and yellows, sometimes browns or greens, always muted. But in the middle, or in the distance, there is something, a speck of brightness, a blob, the white-grey of Father Daly’s handkerchief-iceberg, the light that draws your eye and that looks and feels like a breath of gulping air. If you thaw the frozenness, if you let it melt into the Irish Sea, then a space can open up, the iceberg no longer blocks your view and holds you in its frozen time. Behind you lies the city, with its plurality of people, before you the sky and the vastness of the ocean, deep and bold and cerulean blue. Duffy’s iceberg queen, a mammoth, back-turned Victoria, ascends into that blue, the blue of space, the blue of a possibility that allowed for an impossible friendship during the short time that former IRA member Martin McGuinness and the once-trenchant defender of Unionism, Reverend Ian Paisley, worked together in government in Northern Ireland. If you thaw the frozenness, a space opens up, and into that space walked Ian Paisley Jr., son of the good preacher, on various radio stations in January 2017, offering “humble and honest thanks” to Martin McGuinness on the occasion of the latter’s retirement.

In the North Atlantic, the largest iceberg on record was measured at 550 feet above sea level, the same height as a 55-story building; less tremendous ice structures can still reach more than 200 feet high. The Titanic, travelling at top speed on a calm night, crashed into an iceberg that was more than a mile long and 100 feet high and had been growing into its dense mass of packed ice since the time of Tutankhamun, although once such an iceberg drifts from the Arctic to the warmer waters of the North Atlantic, which this one had, it will normally melt in two or three years. To an impatient human eye, this melting will be imperceptible until it is close to completion. My daughter likes to play with ice cubes; she takes them from her glass of water and lays them on the table, where she can contemplate their light, their translucence. When she first started this game, I watched benignly; these days I place a tissue or napkin on the table to soak up the water that spreads out so suddenly as the cube, whole only a moment before, turns liquid before our eyes.

In Bosnia, where I’ve been doing research, the iceberg is still solid, a mountainous whole that blocks ethnicities from seeing across to each other. The Bosnian peace deal of 1995 somehow managed to avoid the formation of literal borders; instead, the populace has retreated into different enclaves across the country, Muslims stick with Muslims, Serbs with Serbs, and so forth. The saddest example of this is Sarajevo, which now sits within the Federation of Bosnia and Herzegovina, one of two political entities that compose Bosnia. Sarajevo’s population is now almost 90 percent Muslim, many of them newcomers since the ending of the war; former residents, most of whom will never go back, mourn the city that was once multi-ethnic and cosmopolitan. Everything has changed, they will tell you, shaking their heads; the city is totally different. There is still separatist agitation, particularly in the Republika Srpska, the political entity that sits closest to Serbia proper, whose nationalist leaders threaten to form their own tiny state, but the frozen iceberg contains more than that: it holds the pain of deceit, of mistrust, of horrors, of loss, of history and geography, of denial and defense. Most of my time in Bosnia was spent in the Republika Srpska, in the east of the country, where I found myself crossing and recrossing the border with Serbia. At each crossing, I encountered the checkpoints: the wait, the documents handed over, the computer clearance, the questions on occasion, the stamps. My husband, a photographer, made the crossing alone once and was held for two hours while guards went through his equipment, his backpack, his wallet. The determined absorption of different religions and cultures into the shape of a single Yugoslavia after the First and Second World Wars had its problems too, but those Bosnians old enough to remember the time when the many amalgamated into the one speak of it wistfully, softly, as if it were a fairy tale they used to tell themselves as children. Their iceberg was waiting, biding its time out at sea, before it floated inland to lodge itself forcibly among them. The disappearance of the border between North and South Ireland has not sunk our icebergs. But over the past sixteen years, until the schism that was June 23, 2016, we had found ways to float one with the other, moored and not, comfortable and not, settled and not.

Is it possible to hold two contrary ideas at the same time: that sense that merging is both terrifying and monumental, the knowledge that we are all different, but that we live within a common world, that we can choose to be something and not? Although Alice Notley wrote that the birth of her child had left her “undone” — “feel as if I will / never be, was never born” — she could still see the other side: “Of two poems one sentimental and one not / I choose both / of his birth and my painful unbirth I choose both.” She hung in the balance, remained midway, gave herself over to not settling in. The child that was a baby when I began my Maze project recently turned ten and is in process, in transition. She is a self that I am not, although that self, according to Deleuze and Guattari, is only “a threshold, a door, a becoming between two multiplicities.” Her identity is no more fixed than mine is, than mine ever was, for all that I have scrambled to chase it. “What is real,” write the philosophers, “is the becoming itself. The only way to get outside the dualisms is to be-between, to pass between, the intermezzo.” There is confusion, and much relief, in such malleable thinking.

I was wrong, of course, in the assumptions I made about the Maze story. After the initial openness that followed the prison’s closure in 2000, when the paramilitary prisoners were let out and the public allowed in, political wrangling slowly strangled the goodwill until the great gates swung shut again; they have stayed shut, more or less, ever since. When I talked to people in and around the prison about politics and the peace, they felt bitter and hurt and sad, and that was not easy to hear. But any hard edges of fear and certainty seemed also to have blurred into a resignation that meant we could at least stand outside of compartmentalizations and inside the fuzzy space that doubt tends to uncover.

It was almost always cold at the Maze; even during summer, the fog hung heavy over its vast flatness. When I was in need of warming, I would retreat to the small security hut at the entrance to the site, where a handful of guards took phone calls and processed visitors. What I recollect about these visits are the moments of recognition. One of the men, a gentle soul, English-accented, who had lost his wife too early and now lived a simple life of work and extended family, was the chatty type. I still remember how he once articulated my fear. “You dip your finger in a pool of water, swirl it about for a while, and when you take it out, the water will return to the way it was. Then it will be as if you never were.”

***

This essay first appeared in Brick, the biannual print journal of nonfiction based in Canada and read throughout the world. Our thanks to Rachel Andrews and the staff at Brick for allowing us to reprint this essay at Longreads.


Dead Girls: An Interview with Alice Bolin

Laura Palmer, Twin Peaks, American Broadcasting Company

Hope Reese | Longreads | July 2018 | 12 minutes (3,114 words)

“It’s clear we love the Dead Girl, enough to rehash and reproduce her story, to kill her again and again,” writes Alice Bolin. “But not enough to see a pattern. She is always singular, an anomaly, the juicy new mystery.”

In her debut collection Dead Girls: Essays on Surviving an American Obsession, Bolin takes aim at what she calls the “Dead Girl Show” — a genre of entertainment that centers around solving the mystery of a dead, or missing, girl. Approaching the subject with deep intellectual curiosity, Bolin dissects texts and manuscripts — from Joan Didion’s nonfiction to Veronica Mars — that reveal how dead “girls” or women have become a trope of entertainment, serving as a vehicle for sleuthing or as a venue to sort out “male problems.” The result is a compelling case that these plotlines are not merely problematic and inaccurate, but are damaging to society.

The “Dead Girl” genre, Bolin tells me, is not just about gender — it’s equally about race. “There is a lot of privilege wrapped up in the dead girl body, and in the ways that the body is sanctified. That’s a better reason than any to let some of these stories go: the overvaluing of a white woman’s body,” she said. “It’s not good for anyone.”

Read more…

My Great-Grandfather, the Nigerian Slave-Trader

Longreads Pick

Adaobi Tricia Nwaubani writes about how her family is reckoning with her Igbo great-grandfather’s work in the transatlantic slave trade.

Source: New Yorker
Published: Jul 15, 2018
Length: 14 minutes (3,692 words)

The Far Right’s Fight Against Race-Conscious School Admissions

Attorney Bert Rein speaks to the media alongside plaintiff Abigail Noel Fisher after the U.S. Supreme Court heard oral arguments in Fisher v. University of Texas at Austin on October 10, 2012, in Washington, D.C. The justices were asked to rule on whether the university’s consideration of race in admissions is constitutional. (Photo by Mark Wilson/Getty Images)

Late in the afternoon on July 3, the Department of Justice announced it was rescinding 24 documents issued by the Obama administration between 2011 and 2016. The documents  offered guidance to a range of constituencies, including homeowners, law enforcement, and employers. Some detailed employment protections for refugees and asylees; seven of the 24 discussed policies and Supreme Court rulings on race-conscious admissions practices in elementary, secondary, and post-secondary schools. In its statement, the DOJ called the guides “unnecessary, outdated, inconsistent with existing law, or otherwise improper.”

No immediate policy change will come from the documents’ removal. It’s more of a signal, a gesture in a direction, a statement about ideology. The Trump administration has already enacted several hard-line positions on immigration. And the Sessions-backed Justice Department has made a habit of signaling, by way of gesture, its opposition to affirmative action, and its belief that race-conscious policies, specifically, often amount to acts of discrimination.

***

The term “affirmative action” is ambiguous and has never been strictly defined. It’s a collection of notions, gestures, and ideas that existed before its present-day association with race. According to Smithsonian, the term was likely first used in the Depression-era Wagner Act. This legislation aimed to end harmful labor practices and encourage collective bargaining. It also mandated that employers found in violation “take affirmative action including reinstatement of employees with or without backpay” to prevent the continuation of harmful practices. The reinstatement and payment of dismissed employees were affirmative gestures that could be taken to right a wrong.

Six years later, in 1941, under pressure from organizer A. Philip Randolph, President Franklin D. Roosevelt issued Executive Order 8802 to prohibit race-based discrimination in the defense industries during the buildup to WWII. It is considered the first federal action to oppose racial discrimination since Reconstruction, and paved the way for President John F. Kennedy, who was the first to use “affirmative action” in association with race in Executive Order 10925. Kennedy’s order instructed government contractors to take “affirmative action to ensure that applicants are employed,” regardless of “race, creed, color, or national origin.” President Lyndon B. Johnson expanded the scope of Kennedy’s order to add religion when he issued Executive Order 11246 in 1965. Two years later, Johnson amended his own document to include sex on the list of protected attributes.

It was Republican president Richard Nixon who expanded the use of affirmative action to ensure equal employment in all facets of government in 1969, when he issued Executive Order 11478. Nixon ran for office in 1968 on “law and order” and “tough on crime” messaging. He believed what he called “black capitalism” — the idea of thriving black communities with high rates of employment and entrepreneurship — would ease the agitations of civil rights groups and end urban unrest. At the time, Nixon’s rhetoric won the support of a smattering of black cultural figures such as James Brown. “Black capitalism” was little more than a co-optation of some of the tenets of Black Power, which itself had come from a long-established line of conservative black political thought that emphasized economic empowerment and independence, self-determination and personal responsibility. In his version, Nixon envisioned only a slight role for the federal government; without the push of significant government investment, the policies and programs he created didn’t result in sweeping change. Still, shadows of Nixon’s thinking on black economics endured: They’re present in multiple speeches Obama made to black audiences during his presidency; Jay Z’s raps about the transformative, generational effects of his wealth; Kanye West’s TMZ and Twitter rants. Also, the backlash Nixon faced is remarkably similar in tone and content to today’s challenges to affirmative action, which typically involve a white person’s complaints about the incremental gains made by members of a previously disadvantaged group:

In 1969 Section 8(a) of the Small Business Act authorized the SBA to manage a program to coordinate government agencies in allocating a certain number of contracts to minority small businesses—referred to as procurements or contract “set-asides.” Daniel Moynihan, author of the controversial Moynihan Report, helped shape the program. By 1971 the SBA had allocated $66 million in federal contracts to minority firms, making it the most robust federal aid to minority businesses. Still, the total contracts given to minority firms amounted to only .1 percent of the $76 billion in total federal government contracts that year.

Yet even these miniscule minority set-asides immediately faced backlash from blue-collar workers, white construction firms, and conservatives, who called them “preferential treatment” for minorities. Ironically, multiple studies revealed that 20 percent of these already meager set-asides ended up going to white-owned firms.

***

A sense of lost advantage and power seems to animate both historical and recent challenges to race-based policies and practices. In Regents of the University of California v. Bakke (1978), the first affirmative action case the Supreme Court ruled on, Allan Bakke, a white University of California at Davis medical school applicant, sued the school after being twice denied admission. The school had created a system to set aside a certain number of spaces for students from marginalized groups. The Court decided practices that relied on quota systems were unconstitutional, but it upheld the use of race in admissions decisions as long as it was among a host of other factors. Rulings in subsequent cases, such as Grutter v. Bollinger (2003) and, most recently, Fisher v. University of Texas (2016), supported the use of race in admissions and reiterated the federal government’s interest in the diversity of the nation’s institutions. In the most recent case, now-retired Justice Anthony Kennedy provided the Court’s swing vote.

Plaintiffs in affirmative action challenges tend to argue that race-conscious admissions policies violate rights granted by the Fourteenth Amendment, especially its clause guaranteeing “equal protection of the laws.” Ratified 150 years ago last week, the Fourteenth Amendment established birthright citizenship and defined citizenship’s parameters. Its ideas originated in the years leading up to Reconstruction, during “colored conventions” held among African American leaders and activists, and form the underpinnings of Brown v. Board of Education (1954) and some provisions of the Civil Rights Act of 1964.

One of the most prominent opponents of affirmative action, Edward Blum, a fellow at the American Enterprise Institute, actively seeks and recruits aggrieved plaintiffs and attorneys to challenge race-based policies in school admissions and voting practices. Blum was the force behind the complaint of Abigail Fisher, the white student at the center of Fisher v. University of Texas. According to the New York Times:

In the Texas affirmative action case, he told a friend that he was looking for a white applicant to the University of Texas at Austin, his own alma mater, to challenge its admissions criteria. The friend passed the word to his daughter, Abigail Fisher. About six months later, the university rejected Ms. Fisher’s application.

“I immediately said, ‘Hey, can we call Edward?’” she recalled in an interview.

The case went to the Supreme Court twice, and though Ms. Fisher was portrayed as a less than stellar student, vilified as supporting a racist agenda, and ultimately lost, she said she still believed in Mr. Blum. “I think we started a conversation,” she said. “Edward obviously is not going to just lie down and play dead.”

Blum’s first lawsuit came about after he lost a Congressional election in Houston because, he felt, the boundaries of his district were drawn solely along racial lines. He is now behind lawsuits against Harvard University and the University of North Carolina at Chapel Hill, which allege the schools’ admissions policies discriminate against Asian American applicants. It is interesting and bold to use white women and Asian American students to dismantle programs meant to address America’s legacy of discrimination. Both groups have benefited significantly from Reconstruction and Civil Rights-era policies and legislation. Do Blum, Sessions, and their supporters believe race-based policies are irrelevant, illegal, or improper because for many, they’ve worked? I sense something more nefarious at play, such as a mounting sense of loss and growing resentment that the demographic shifts in our country also mean inevitable shifts in who holds power.

The Sessions-helmed Justice Department’s signals, and the nomination of Judge Brett Kavanaugh to the high court, have, I’m sure, heartened activists like Blum. For the Nation, Eric Foner wrote about how the Fourteenth Amendment’s ambiguity is what allows it to be used in a way that is so at odds with the spirit of its origins. It is that ambiguity, he says, that will allow, someday, in a different political climate, for another era of correction.


Clocking Out

Getty

Livia Gershon | Longreads | July 2018 | 9 minutes (2,261 words)

On May 1, 1886, 80,000 workers marched through the streets of Chicago. As soldiers and private police aimed their rifles into the crowd, “no smoke curled up from the tall chimneys of the factories and mills,” the Tribune reported. “Things had assumed a Sabbath-like appearance.” Chicago, an industrial boomtown, was the center of what became that day a mass labor action; more than 300,000 workers staged a strike across the country. The participants were skilled and unskilled, immigrant and native-born, revolutionary and reformist. What drew them together was a common demand, expressed in a popular labor song that many of the marchers sang: “We want to feel the sunshine / And we want to smell the flow’rs / We are sure that God has willed it / And we mean to have eight hours.”

Read more…

The Wheel, the Woman, and the Human Body

CTK via AP Images

Margaret Guroff | The Mechanical Horse | University of Texas Press | April 2016 | 35 minutes (4,915 words)

Angeline Allen must have been pleased. On October 28, 1893, the 20-something divorcée, an aspiring model, made the cover of the country’s most popular men’s magazine, a titillating journal of crime, sport, and cheesecake called the National Police Gazette. Granted, the reason wasn’t Allen’s “wealth of golden hair” or “strikingly pretty face,” though the magazine mentioned both. Rather, the cover story was about Allen’s attire during a recent bicycle ride near her Newark, New Jersey, home. The “eccentric” young woman had ridden through town in “a costume that caused hundreds to turn and gaze in astonishment,” the Gazette reported.

The story’s headline summed up the cause of fascination: “She Wore Trousers” — dark blue corduroy bloomers, to be exact, snug around the calves and puffy above the knees. “She rode her wheel through the principal streets in a leisurely manner and appeared to be utterly oblivious of the sensation she was causing,” according to the reporter.

It is unlikely Allen was truly oblivious, having already shown an exhibitionistic streak over the summer when she appeared on an Asbury Park, New Jersey, beach in a bathing skirt that “did not reach within many inches of her knees,” according to a disapproving newspaper report. (“Her stockings or tights were of light blue silk,” the report added.) Allen didn’t mind people noticing her revealing outfits — “that’s what I wear them for,” she told one reporter — and she kept cycling around Newark in pants despite the journalistic scolding. As another paper reported that November, “The natives watch for her with bated breath, and her appearance is the signal for a rush to all the front windows along the street.”

For a grown woman to reveal so much leg in public was a staggeringly brazen act. What was noticeably unnoteworthy by then was Allen’s choice of vehicle. Ten years earlier, all bicycles had been high-wheelers, and riding one had been largely the province of daring, athletic men. The women who had attempted it were seen as acrobats, hussies, or freaks; one female performer who rode a high-wheeler in the early 1880s was perceived as “a sort of semi-monster,” another woman reported. But by the early 1890s, the bike had undergone a transformation. Allen’s machine — a so-called safety bicycle — had two thigh-high wheels; air-filled rubber tires; and rear-wheel drive, with a chain to transmit power from the pedals. In fact, it looked a lot like a 21st-century commuter bike, and it had become nearly as acceptable as one. Even the fashion police who scorned Allen’s riding outfit didn’t object to her riding.

What had happened to the bicycle in the interim? Market expansion. In the 1880s, when bicycle makers had begun to saturate the limited market for high-wheelers, they sought products to entice other would-be riders, particularly men who had aged out of the strenuous high-wheel lifestyle. In the United States, where bad roads made tricycle ridership impractical, the sales potential for an easy-to-ride bicycle looked stronger than in Europe. In response, manufacturers on both sides of the Atlantic created a profusion of high-tech two-wheelers, including models with foot levers instead of pedals; “geared up” bikes with chains and sprockets that spun the driving wheel more than once for each rotation of the cycle’s cranks; and a supposedly header-proof version with the small wheel in the front and the big wheel in the rear. Riders and makers started calling the standard high-wheeler an “Ordinary” to distinguish it from experimental models.

Several of the new bikes used geared-up rear-wheel drive as a way to bring the rider closer to the ground. The most influential of these was the English Rover, with a rear driving wheel only thirty inches tall that had as much force as a 50-inch Ordinary wheel. (Even today, American bicycle gears are measured in “gear inches,” which indicate how tall an Ordinary wheel of equivalent force would be.) At 36 inches, the Rover’s front wheel was slightly bigger than its rear one, but apart from that, the machine looked as streamlined as some models of fifty or a hundred years later.
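The gear-inch convention the author mentions is simple arithmetic: a chain-driven wheel covers as much ground per pedal revolution as a directly driven Ordinary wheel whose diameter equals the drive wheel’s diameter multiplied by the gear-up ratio. As a purely illustrative aside (not from the book), the minimal sketch below applies the modern formula of wheel diameter in inches times chainring teeth divided by cog teeth; the 30- and 18-tooth sprockets are hypothetical values chosen so that the Rover’s 30-inch wheel matches the 50-inch Ordinary cited above.

```python
# Illustrative sketch of "gear inches" (not from Guroff's book).
# Gear inches express how tall a directly driven Ordinary wheel would need
# to be to travel as far per pedal revolution as a geared-up safety wheel.

def gear_inches(wheel_diameter_in: float, chainring_teeth: int, cog_teeth: int) -> float:
    """Modern convention: wheel diameter (inches) * chainring teeth / cog teeth."""
    return wheel_diameter_in * chainring_teeth / cog_teeth

# Hypothetical sprockets: a 30-inch drive wheel geared up 30:18 (about 1.67x)
# has the reach of a 50-inch Ordinary wheel, as described in the passage.
print(gear_inches(30, 30, 18))  # 50.0
```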

Introduced in England in 1885, the Rover Safety Bicycle delivered the speed of an Ordinary, but with a greatly diminished risk of skull fracture from flying over the handlebars. The Rover’s manufacturer made some quick refinements, and a model with same-sized wheels caught on in Britain and inspired a fleet of imitators: low-mount, rear-wheel-drive bikes also called “safeties.”

The major US manufacturers weren’t impressed by this new low profile, though; they dismissed the safety style as a mistake. In 1886, after a two-month tour of England’s bicycle factories, the US industry titan Albert Pope expressed confidence in his high-wheeler: “I looked at nearly all the principal [English] makes and I could not find a point that was in any way an improvement over our own.” Echoed his lieutenant, George H. Day, who also made the trip, “Every innovation is regarded as a trap.”

But when imported safeties hit the US market in the spring of 1887, the machines found eager buyers; Pope and other American cycle makers scrambled to put out their own versions of the header-resistant contraptions. By November, the safety bicycle was established in the United States as the modern option for men, even though its low wheels evoked the comically old-timey velocipede of 20 years prior, as one bard made clear in the accented voice of an immigrant child:

In days of old, full many a time
You’ve heard it told, in prose and rhyme,
How down the street a wheelman came,
And chanced to meet his beauteous flame
Just where a pup in ambush lay,
To tip him up upon the way,
And make him wish that he was dead,
While gyrating upon his head.
In days of old
You’ve heard it told.
But nowadays, it’s otherwise.
The safety craze new joy supplies;
The boulders lose their terrors grim,
Stray cans and shoes are naught to him;
He laughs at rocks, he kicks the pup,
But, in the end, things even up;
For, as his maid he gayly greets,
Some unwashed urchin always bleats —
“Hi, look at der big man on der melosipetes!”

For a short time, Ordinaries and safeties coexisted like Neanderthals and Homo sapiens, with the bigger, older species continuing to inhabit its traditional niche while the smaller, nimbler creature carved out a new one. “I do not think that [the safety] will hurt the sale of the Ordinary bicycle,” predicted one US industry watcher in late 1887. “It will open the pleasures of cycling to a great many who have been afraid to venture upon a high machine.” The writer was thinking of physicians and other “professional men” for whom an Ordinary was too dangerous, but some enthusiasts suspected that the safety would also appeal to female riders. Offering women “a clumsy wheelbarrow of a tricycle” to ride while men zip around on slender bikes, wrote one sympathetic man, “is offering a woman a stone to eat while men have soft biscuit.”

And the safety bicycle’s low profile did intrigue many American women, especially after the spring of 1888, when makers offered a drop-frame version, in which the bike’s top bar scooped downward to make room for a lady rider’s long skirts. As one woman reported that year, “A sudden desire began to awake in the feminine mind to ascertain for itself by personal experience, what were those joys of the two-wheeler which they had so often heard boastfully vaunted as superior, a thousand times, to the more sober delights of the staid tricycle.”

With the safety’s smaller wheels, its ride was bumpier than the Ordinary’s at first. But then came the pneumatic tire. Devised in Ireland in 1888 by a veterinarian named John Boyd Dunlop, who was seeking a faster ride for his son’s trike, the air-filled rubber tube cushioned the road’s ruts and bulges in a way that springs and other early shock-absorbing devices never could. This marvel arrived in the United States by 1890 and became standard equipment on American safeties within a few years. “It permitted travel on streets and roads previously thought unrideable,” recalled an American journalist of the time, “and added to cycling a degree of ease and comfort never dreamed of.”

In the 1890s, bikes got lighter as well as more comfortable. The average weight of a bicycle dropped by more than half during the decade's first five years, falling from 50 pounds to 23. And since new gearings were able to mimic wheels larger than those of the largest Ordinary, speed records fell too. In 1894, while riding a pneumatic-tired safety around a track in Buffalo, New York, the racer John S. Johnson went a mile in just over one minute and thirty-five seconds, a rate of nearly thirty-eight miles an hour. He beat the previous mile record for a safety by fourteen seconds, and the record for an Ordinary by nearly a minute — and the record for a running horse by one-tenth of a second.
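
As a quick check on that figure (assuming a one-mile time of roughly 95 seconds, per the "just over one minute and thirty-five seconds" above), the conversion to miles per hour is simple arithmetic:

```python
# Convert Johnson's one-mile time into an average speed in miles per hour.
seconds_per_mile = 95              # one minute, thirty-five seconds
mph = 3600 / seconds_per_mile      # seconds in an hour divided by seconds per mile
print(round(mph, 1))               # -> 37.9, i.e., nearly thirty-eight miles an hour
```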

The Ordinary — which had by then acquired the derisive nickname of "penny-farthing," after the old British penny and much smaller farthing (quarter-penny) coins — became obsolete. High-wheelers that had sold for $150 to $300 just a year or two earlier were going for as little as $10.

The first safeties, meanwhile, cost an average of $150 during a time when the average worker earned something like $12 a week. At such prices, the new bikes targeted the same upscale demographic as the tricycle. But a strong market for safeties among well-to-do women goosed production, and competition among manufacturers reduced prices, making the bikes affordable to more would-be riders — and further fueling demand. In 1895, America’s 300 bicycle companies produced 500,000 safeties at an average price of $75, according to one encyclopedia’s yearbook. Even manufacturers were surprised at the demand among women, who thrilled to the new machine’s exhilarating ride. As one female journalist wrote, “If a pitying Providence should suddenly fit light, strong wings to the back of a toiling tortoise, that patient cumberer of the ground could hardly feel a more astonishing sense of exhilaration than a woman experiences when first she becomes a mistress of her wheel.”

It wasn’t just that women enjoyed the physical sensation of riding — the rush of balancing and cruising. What made the bicycle truly liberating was its fundamental incompatibility with many of the limits placed on women. Take clothing, for example. Starting at puberty, women were expected to wear heavy floor-length skirts, rigid corsets, and tight, pointy-toed shoes. These garments made any sort of physical exertion difficult, as young girls sadly discovered. “I ‘ran wild’ until my 16th birthday, when the hampering long skirts were brought, with their accompanying corset and high heels,” recalled the temperance activist Frances Willard in an 1895 memoir. “I remember writing in my journal, in the first heartbreak of a young human colt taken from its pleasant pasture, ‘Altogether, I recognize that my occupation is gone.’” Reformers had been calling for more sensible clothing for women since the 1850s, when the newspaper editor Amelia Bloomer wore the baggy trousers that critics named after her, but rational arguments hadn’t made much headway.

Where reason failed, though, recreation succeeded. The drop-frame safety did allow women to ride in dresses, but not in the swagged, voluminous frocks of the Victorian parlor. Female cyclists had to don simple, “short” (that is, ankle-length) skirts in order to avoid getting them caught under the bicycle’s rear wheel. And to keep them from flying up, some women had tailors put weights in their hems or line their skirt fronts with leather. Other women, like Angeline Allen, shucked their dresses altogether and wore bloomers. The display that reporters had deemed shocking in 1893 became commonplace just a few years later as more and more women started riding. “The eye of the spectator has long since become accustomed to costumes once conspicuous,” wrote an American journalist in 1895. “Bloomer and tailor-made alike ride on unchallenged.” (For her part, Allen may well have given up riding, but not scandal; she progressed to posing onstage in scanty attire for re-creations of famous paintings, a risqué popular amusement.)

Bicyclists’ corsets changed too, though less publicly. The corset of the 1880s was an armpit-to-hip garment stiffened with whalebone stays, which helped the hips support heavy skirts that hung from the waist. But while corsets braced women’s torsos, they also weakened their wearers, squeezing women’s lungs and displacing other internal organs, making deep breaths impossible. Out of necessity, female cyclists looked for alternatives, and many chose another garment that had been advocated by dress reformers decades earlier: a sturdy, waist-length cotton camisole with shoulder straps. When introduced in the 1870s, this garment was called an “emancipation waist,” and it featured a horizontal band of buttons at the hem, to which drawers or a skirt could be attached. Later versions were named “health waist” or, finally, “bicycle waist.” One 1896 model included elastic insets; its maker promised the wearer “perfect comfort — a sound pair of lungs — a graceful figure and rosy cheeks.” All for $1, postpaid.

If women's clothing constrained them, so did their role in society. More Americans than ever worked outside the home; by 1880, farmers made up a little less than half of the country's labor force. But even among the urban working class, married women typically stayed home during the day to cook, clean, tend to children, and often manufacture homemade goods for sale. Meanwhile, their husbands, sons, and unmarried daughters toiled in factories, shops, offices, and other people's houses. Many Americans came to believe that men and women naturally inhabited two separate spheres: men held sway in business, politics, and other public arenas, and women took charge of the home. For most middle-class women, respectability meant appearing in public only under certain circumstances — such as while shopping — and making as small an impression as possible. "A true lady walks the streets unostentatiously and with becoming reserve," instructed an 1889 etiquette manual. "She appears unconscious of all sights and sounds which a lady ought not to perceive."

In addition, an unmarried young woman didn’t go out without a chaperone, usually an older female relative. Being seen on an unchaperoned date, even at a restaurant or other public place, could be cause for social ruin. An 1887 etiquette guide warned against sailing excursions, for example, lest the boat be becalmed overnight: “A single careless act of this sort may be remembered spitefully against a girl for many years.”

The bicycle challenged all that. Wives who had stayed close to home — venturing out only on foot, by trolley, or, if wealthy, with a driver and horse-drawn carriage — were suddenly able to travel miles on their own. Being so mobile, and so visible, was a revelation to many. “The world is a new and another sphere under the bicyclist’s observation,” wrote one female journalist. “Here is a process of locomotion that is absolutely at her command.” If a woman’s sphere begins to feel too small, wrote another, “the sufferer can do no better than to flatten her sphere to a circle, mount it, and take to the road.”

As for unmarried women, manners mavens urged them to cycle only with chaperones, but the rule didn't take. "New social laws have been enacted to meet the requirements of the new order," reported one newspaper editor in 1896. "Parents who will not allow their daughters to accompany young men to the theatre without chaperonage allow them to go bicycle-riding alone with young men. This is considered perfectly proper." According to the editor, the reason for this difference was the "good comradeship" of the bicycling set. Fellow enthusiasts looked out for one another on the road, he wrote — so in a way, every ride was supervised. The historian Ellen Gruber Garvey suggests a second possible reason: propriety already allowed unmarried women to ride horses unchaperoned. Bicycles, as a less costly equivalent, may simply have extended this freedom down the economic scale.

But the same things that made the bicycle liberating also made it threatening. Moralists warned that skimpy costumes and unsupervised travel would lead to wanton behavior. “Immodest bicycling by young women is to be deplored,” declared Charlotte Smith, founder of the Women’s Rescue League, a group that lobbied Congress on behalf of “fallen women.” “Bicycling by young women has helped to swell the ranks of reckless girls, who finally drift into the standing army of outcast women.” Smith reported that her tours of brothels and interviews with prostitutes confirmed this.

Physicians — who at the time shouldered responsibility for patients’ moral as well as physical well-being — had their own concerns. One visited New York’s Coney Island and saw a 16-year-old cyclist get drunk on wine provided by a beautiful but nefarious older woman. “She looked like an innocent child, but was away from home influence,” the doctor reported. Many physicians fretted that pressure from the bicycle seat would teach girls how to masturbate, a practice thought to lead to spiritual and psychological decline. Climbing hills on a bike could excite “feelings hitherto unknown to, and unrealized by, the young girl,” wrote one doctor in 1898. (Boys faced the same danger: pressure on the perineum would call their attention to the area, warned one doctor, “and so lead to a great increase in masturbation in the timid [and] to early sexual indulgence in the more venturous.”)

The bicycle's peril was medical as well as moral. In the late nineteenth century, many saw physical energy as a finite resource that had to be carefully parceled out, not a power that could be renewed through exercise. The fashionable malaise of neurasthenia was only one of the disorders thought to be caused by a depletion of energies. Overexertion could also cause tuberculosis, scoliosis, hernias, heart disease, and other maladies, doctors believed. Safely sedentary middle-class women, who frequently suffered from varicose veins and other consequences of annual pregnancies, were prone to fatigue; one Boston writer called them "a sex which is born tired," adding that "society sometimes seems little better than a hospital for invalid women." Particularly for women in heavy dresses and constricting corsets, any activity that raised the heart rate could seem more likely to cause fainting and listlessness than to cure them. Opponents of the bicycle latched onto this perception, arguing that riding would cost women more effort than they could afford. "The exertion necessary to riding with speed … is productive of an excitation of nervous and physical energy that is anything but beneficial," Charlotte Smith warned. "If a halt is not called soon, 75 percent of the cyclists will be an army of invalids within the next ten years."

But even as Smith made her dire predictions, Americans' fear of cardiovascular exercise was beginning to lift. For decades, health reformers had trumpeted the benefits of fitness, and during the 1880s, the United States saw a spike in organized physical activity. Citizens of America's growing cities tried new sports such as baseball and football, and exercise advocates built the first public playgrounds and pushed for physical education for both boys and girls. Doctors continued to caution against overexertion, but they acknowledged that, in moderation, fresh air and exercise tended to improve patients' health. The high-wheel bicycle of the 1880s proved the benefits of regular exercise to those who could ride it; proponents made extravagant claims for the risky machine's ability to restore well-being. "For constipation, sleeplessness, dyspepsia, and many other ills which flesh is heir to, not to speak of melancholy,—all are curable, or certainly to be improved, by the new remedy, 'Bicycle,'" wrote a Texas physician in 1883. "It is always an excellent prescription for the convalescents, and nearly always for chronic invalids."

Not everyone could take the prescription, though. High-wheeled cycling and rigorous team sports were acceptable only for young men. The new games deemed suitable for mixed company, such as lawn tennis and golf, were far less taxing — and therefore far less likely to lead to noticeable improvements in fitness. As for working out on your own, the recommended options were either too costly (horseback riding) or too boring (indoor calisthenics) to gain much popularity. As a result, many more Americans of the 1880s thought they ought to exercise than actually did it. So when the safety bicycle appeared at the end of the decade and Americans began riding in large numbers — an estimated two million by 1896, out of a population of about seventy million — few were certain how such vigorous physical activity would affect them.

Doctors were wary. Most US physicians believed that each patient’s condition was based largely on his or her habits and experiences, the weather, and other environmental factors. Good health was a reflection of proper balance among bodily systems and energies. “A distracted mind could curdle the stomach, a dyspeptic stomach could agitate the mind,” writes the medical historian Charles Rosenberg. It was a doctor’s job to know each patient well enough to restore balance when something was out of whack, using laxatives, diuretics, and other purging drugs to reboot the system. Even contagious diseases could not be treated in a cookie-cutter fashion, argued an 1883 medical journal editorial: “No two instances of typhoid fever, or of any other disease, are precisely alike … No ‘rule of thumb,’ no recourse to a formula-book, will avail for proper treatment even of the typical diseases.” To many doctors, advocating a specific drug to cure a specific disease seemed the height of quackery.

And just as there were no one-size-fits-all medical treatments, many physicians believed there were no one-size-fits-all exercise routines. While cycling enthusiasts rhapsodized about the safety bicycle’s benefits for riders of both sexes and all ages, doctors fretted that many of their patients would be harmed by the new machines. Even seeming success stories were suspect. In an 1895 paper on heart disease, one doctor reported that a patient who had panted for breath after climbing one flight of stairs was now able to cycle up hills with ease. “It would be wrong to conclude from this that cycling is not injurious,” the doctor wrote: there hadn’t yet been time to observe the bicycle’s long-term effects. Moreover, as an unfamiliar activity, cycling tended to catch the blame for pretty much anything bad that happened to a new rider afterward, up to and including death.

Logically, acute injuries were a concern. Though the safety bicycle did greatly reduce the risk of head wounds, it didn’t obliterate that risk, particularly among “scorchers” — thrill-seeking youngsters who hunched over their handlebars and pedaled as fast as they could. “It might seem almost impossible to fracture a skull thick enough to permit indulgence in such practices,” reported the Boston Medical and Surgical Journal, “but the bicycle fool at full speed has been able to accomplish it.” Medical journals also noted the danger of road rash and broken bones.

More insidious than crash injuries, though, were new chronic complaints attributed to cycling. The bent-over posture of the scorcher was thought to cause a permanent hunch called “kyphosis bicyclistarum,” or, familiarly, “cyclist’s stoop.” Repeated stress to the cardiovascular system — that is, regular workouts — could lead to the irregular heartbeats and poor circulation of “bicycle heart.” Gripping the handlebars too tightly might cause finger numbness, or “bicycle hand,” and a dusty ride could trigger “cyclist’s sore throat.” Practically every body part seemed to have its own cycle-related malady; at least one New York doctor devoted his entire practice to treating such ailments.

Of all the physical woes attributed to the bike, the one that most strained credulity was the “bicycle face.” Characterized by wide, wild eyes; a grim set to the mouth; and a migration of facial features toward the center, the disorder was said to result from the stress of incessant balancing. A German philosopher claimed that the condition drained “every vestige of intelligence” from the sufferer’s appearance and rendered children unrecognizable to their own mothers. The bicycle face hung on, too, warned a journalist: “Once fixed upon the countenance, it can never be removed.”

The doctors raising these alarms were careful to state that many of the new diseases affected only cyclists predisposed to them — which would explain why so few of their fellow physicians might have encountered the disorders. “Whilst thousands ride immune, a small percentage will suffer,” wrote one doctor. Another, who blamed cases of appendicitis, inflammatory bowel disease, and the thyroid condition Graves’ disease on excessive riding, said it didn’t matter how many people believed that cycling had improved their health: “It would not affect my argument in the least if swarms of them had been rescued from the grave.”

Nevertheless, the more Americans took to bicycling, the more tenuous these claims of danger came to seem. The machine made physical activity both practical and fun. “The bicycle is inducing multitudes of people to take regular exercise who have long been in need of such exercise, but who could never be induced to take it by any means hitherto devised,” one doctor wrote in Harper’s Weekly in 1896. And all that activity had an effect. Riders quickly noticed improved muscle tone, increased strength, better sleep, and brighter moods. Women, especially, transformed themselves, wrote the novelist Maurice Thompson in 1897: “We have already become accustomed to seeing sunbrowned faces, once sallow and languid, whisk past us at every turn of the street. The magnetism of vivid health has overcome conservative barriers that were impregnable to every other force.”

The empirical evidence of cycling's health value began to overtake conservative doctors' concerns, as the rhetoric scholar Sarah Overbaugh Hallenbeck argues. Though many physicians continued to raise objections to the sport, their voices were increasingly drowned out by those of more observant — and pragmatic — practitioners. "The bicycle face, elbow, back, shoulders, neck, eroticism," wrote one military doctor in 1896, "I pass as not worthy of serious consideration." Rather than discourage bicycle use, most physicians came to cautiously endorse it. "So long as the cyclist can breathe with the mouth shut," wrote one such doctor in 1895, "he is certainly perfectly safe." Some went further, citing evidence of the bike's benefits for heart patients, migraine sufferers, diabetics, and others with chronic conditions. In Chicago, the demand for injectable morphine dropped as patients with anxiety or insomnia "discovered that a long spin in the fresh air on a cycle induces sweet sleep better than their favorite drug," the Bulletin of Pharmacy reported.

This shift paralleled a transformation in medical thinking during the 1890s, when American physicians increasingly embraced the scientific method. Some clinics in Continental Europe had adopted this evidence-based approach early in the nineteenth century, using statistics to determine the efficacy of treatments and evaluating patients’ conditions according to universal norms, rather than trying to divine what was normal for each individual patient. In the United States, however, doctors arguing for this approach were long in the minority. According to Rosenberg, the rift between medical traditionalists and empiricists “provided an emotional fault line which marked the profession throughout the last two-thirds of the century.” Only at the very end of the nineteenth century did a research-based, objective philosophy take hold at US medical schools.

It would be folly to suggest that the bicycle alone caused this transformation. Many other factors were at play, such as improved trans-Atlantic communication; an influx of European immigrants, including scientists; and a snowballing of evidence for new medical concepts such as the germ theory of disease. For centuries, Western healers had believed that contagion could erupt spontaneously, but between 1870 and 1900, researchers disproved this theory by isolating the microscopic causes of illnesses including typhoid, tuberculosis, cholera, diphtheria, meningococcal meningitis, plague, and malaria.

But even if the bike did not independently modernize American medicine, its unprecedented impact on fitness — and the clash this revealed between what doctors said and what experience showed — may well have accelerated the shift. Much as the bicycle triggered changes in women’s dress that high-minded advocacy could not, it bolstered scientists’ then-radical argument that what is good for one human body tends to be just as good for another.

To the bicycle faithful of the 1890s, this seemed to be just the beginning of the changes that the machine would bring about. The gulf between social classes would recede under the influence of this “great leveler,” one enthusiast wrote in the Century Magazine: “It puts the poor man on a level with the rich, enabling him to ‘sing the song of the open road’ as freely as the millionaire, and to widen his knowledge by visiting the regions near to or far from his home, observing how other men live.”

And while women may not yet have had full access to higher education — or even the right to vote — the unchaperoned, self-propelled bloomer girl seemed to be pedaling in that direction. "In possession of her bicycle, the daughter of the 19th century feels that the declaration of her independence has been proclaimed," wrote one female journalist, "and, in the fulness of time, all things will be added to complete her happiness and prosperity."

The first-wave feminist Susan B. Anthony was born in 1820, the year after Charles Willson Peale built his iron draisine. By the time of the safety bicycle boom of the 1890s, she was a snowy-haired eminence, too old to risk riding, but she had an opinion of the sport. “I’ll tell you what I think of bicycling,” she said in an 1896 newspaper interview as she leaned forward to lay a hand on the reporter’s arm. “I think it has done more to emancipate woman than any one thing in the world.”

***

From The Mechanical Horse: How the Bicycle Reshaped American Life. Copyright © 2016 by Margaret Guroff. All rights reserved, with permission of the University of Texas Press.