
A Girl’s Guide to Missiles

AP Photo/Phil Sandlin, File

Karen Piper | A Girl’s Guide to Missiles | Viking | August 2018 | 38 minutes (7,502 words)

“Don’t touch any ordnance,” the guide said. “If you see any lying around. It could explode.” Fiftyish and portly, he was wearing jeans and a T-shirt and might have passed for a truck driver if not for the B-2 bomber on his cap. Above the plane, the hat read “Northrop,” where I assumed he must have worked, maybe even on the B-2. The group of twenty or so tripod-toting tourists, there to photograph the largest collection of petroglyphs in the Western Hemisphere, looked around warily. A few people laughed, others fidgeted. Only my mom and I knew that we really could explode.

“Ordnance, what’s ordnance?” the woman next to me whispered with a plaintive smile as we began our walk into the canyons. One glance at her tripod made me worry. It was almost as tall as her, and she looked wobbly already.

“Missiles, bombs, that sort of thing,” I said. She stopped and stepped back, her smile dropping. What did she expect? I thought. We were at China Lake Naval Weapons Center, after all. Things were supposed to explode.

A British Seaweed Scientist Is Revered in Japan as ‘The Mother of the Sea’

Pahala Basuki / Unsplash, Algonquin Books

Susan Hand Shetterly | Excerpt adapted from Seaweed Chronicles: A World at the Water’s Edge | Algonquin Books | August 2017 | 16 minutes (4,260 words)

Occasionally you can still find them out on islands, crumbling near the water’s edge, the old eighteenth- and nineteenth-century kilns built out of stones gathered from the shore. People on the Irish and Scottish coasts and in Brittany cut and burned seaweeds in the pits of those kilns to make potash and pearl ash, valuable potassium salts. The wet seaweeds — Ascophyllum, Fucus, and the kelps — had to be lugged up from the shore, carefully turned and dried, and then burned at a temperature that would render them into products that were sold to make glass and soap, to bleach linens, to encourage bread to rise, and to use as fertilizer to sweeten fields. In the boom time, around 1809, Ireland was exporting about 5,410 tons of potash a year. It was backbreaking work that whole neighborhoods engaged in, and at its height, the many kiln fires created smoke so thick it endangered the lives of nearby pasturing cows. It wasn’t long before the seaweeds in some places were overcut, the shores laid bare.

Then, as suddenly as it had appeared, the market vanished when potassium salt deposits were discovered underground in Germany and in Chile, and mines were opened.

The burning of seaweed resurfaced with the discovery that the ash residue could be used to extract iodine. But that, too, disappeared when deposits of iodine were found belowground. Left alone, seaweeds regrew, with farmers coming to the shore to harvest them for their gardens, and gatherers cutting favorite species to eat and to feed to their domestic animals. Over time, the old kilns were disassembled by wind and rain and snow.

Happy, Healthy Economy

Francesca Russell / Getty

Livia Gershon | Longreads | August 2018 | 8 minutes (2,015 words)

In 1869, a neurologist named George Beard identified a disease he named neurasthenia, understood as the result of fast-paced excess in growing industrial cities. William James, one of the many patients diagnosed, called it “Americanitis.” According to David Schuster, the author of Neurasthenic Nation (2011), symptoms were physical (headaches, muscle pain, impotence) and psychological (anxiety, depression, irritability, “lack of ambition”). Julie Beck, writing for The Atlantic, observed that, among sufferers, “widespread depletion of nervous energy was thought to be a side effect of progress.”

Recently, there have been a number of disconcerting reports that one might view as new signs of Americanitis. A study by the Centers for Disease Control found that, between 1999 and 2016, the suicide rate increased in nearly every state. Another, from researchers at the University of Michigan, discovered that, over the same period, excessive drinking, particularly among people between the ages of 25 and 34, correlated with a sharp rise in deaths from liver disease. A third, by University of Pittsburgh researchers, suggests that deaths from opioid overdoses, recognized for years as an epidemic, were probably undercounted by 70,000.


Weird in the Daylight

Photo by Todd Gunsher

Corbie Hill | No Depression | Spring 2018 | 20 minutes (4,135 words)

The apocalypse came early to Maiden Lane.

The houses on this short dead-end road stand empty and condemned, their doors yawning open and letting in the weather. Just a few hundred yards away, traffic buzzes on Hillsborough Street, a main thoroughfare in Raleigh, North Carolina, that borders N.C. State University. Here, though, everything is uninhabited and decaying. Someone has spray painted “fuck frats” in bold red across the face of the first house on the left, a little single-story blue place in the shadow of a spotless new building. Skillet Gilmore walks into the dilapidated structure without hesitation.

“Karl [Agell] from Corrosion [of Conformity] lived here,” he says. Then he goes room by room, naming other friends who lived in them in the 1990s.

The place is completely wrecked. The floors complain underfoot as if they could give way, and there are gaping holes where the heater vents used to be. The fireplaces have been disassembled with sledgehammers.

“Honestly, it doesn’t look that much worse than it did,” Gilmore offers.

He would know. The house Gilmore lived in at one point in the ’90s is farther down Maiden Lane on the right. So he and Caitlin Cary, who both played in storied Raleigh alt-country band Whiskeytown and who now are married, lead me into that one, too, again going room by room and naming the previous occupants. A construction truck idles outside and across the street, but nobody comes out to tell us to leave. Nobody bothers us at all.

A Thereness Beneath the Thereness: A Jonathan Gold Reading List

Jonathan Gold poses for a portrait during the 2015 Sundance Film Festival in Park City, Utah. (Photo by Larry Busacca/Getty Images)

For four decades, Jonathan Gold tirelessly catalogued the ebb and flow of cuisine in Los Angeles, and in the process became known as the “food writing poet” of the city. That poet, who was diagnosed with pancreatic cancer just this past month, died last week at the age of 57. In his New York Times obituary, Ruth Reichl, who published Gold in Gourmet magazine, said of the writer-critic:

Before Tony Bourdain, before reality TV and ‘Parts Unknown’ and people really being into ethnic food in a serious way, it was Jonathan who got it, completely. He really got that food was a gateway into the people, and that food could really define a community. He was really writing about the people more than the food.

According to David Chang, no one knew more about Korean cuisine than Gold, and the critic, who began his career as a music journalist, became the foremost expert on the cuisines of the world’s various regions. Some say his specialty was Mexican and Central American cooking; he had eaten at every pupuseria, taco stand, and restaurant along the 15.5-mile stretch of Pico Boulevard. But really, Gold’s expertise wasn’t limited by borders.

A Woman’s Work: The Art of the Day Job

Carolita Johnson | Longreads | August 2018 | 19 minutes (4,656 words)

At first I was worried about saying my first day job was as a model in Paris, because I don’t want to infuriate people out there who have certain very hard-to-shake preconceptions (involving envy and scorn, simultaneously) about models and modeling. But you know what? Screw it. My first day job was as a model in Paris.

This is how it happened.

I was a fashion design student at Parsons School of Design back in 1984. A reluctant one. I had wanted to go to SUNY Stony Brook to be an English Major, another thing that infuriates certain demographics, particularly the one my parents belong to: firmly middle class, non-college-educated first-generation Americans. They, with visions gleaned from TV sitcoms and 1950s movies of “mad men of advertising” in their heads, decided they’d rather see themselves dead — “over my dead body,” said my father, only the second time in his life, the first being when I asked for bagpipe lessons — and made me go to art school instead. Who ever heard of that? But yes:

Dad: “I’m not paying to turn you into some kind of pathetic… English Major.”
Me, thinking: “There’s got to be a way to judo-flip this crap to my advantage.”
Mom: “You have talent! Why hide it under a bush?”
Me: “So, I can draw. So what! And it’s a bushel. Also, way to abuse Bible verses in the name of capitalism!”

I fought them to at least let me go to Parsons, because of the BFA in Liberal Studies that was attached to the art degree on offer, unlike F.I.T. at the time, which only offered certificates but was cheaper and therefore more attractive to my dad. I posited that neither of my brothers wanted to attend college, and it wasn’t like I was asking to go to medical school, so they were getting off easy. Also, after raiding my dad’s dresser and finding his bank book, which explained why I’d been turned down for every kind of financial aid I’d applied for, I shamelessly blackmailed him with the terrifying specter of my mother’s rage if she were to find out he was limiting my access to a better, more high-class diploma, which he could perfectly afford. Education was everything in our house, right up there next to financial security and a constant sense of unspecified shame.

The Cowboy Image and the Growth of Western Music

Photo by Michael Ochs Archives/Getty Images

Bill C. Malone and Tracey Laird | Country Music USA | University of Texas Press | June 2018 | 25 minutes (6,531 words)

The emergence of the western image in country music was probably inevitable. Long before the process of commercialization began, the cowboy had been the object of unparalleled romantic adulation and interest. Given the pejorative connotations that clung to farming and rural life, the adoption of cowboy clothing and western themes was a logical step for the country singer.

The increased emphasis on western themes and attitudes appeared unsurprisingly in the westernmost southern states ─ Louisiana, Oklahoma, Texas ─ and in California. In these areas, country music assumed forms differing from those in the more easterly southern states. Oklahoma, Louisiana, and Texas, although southern in traditional orientation, embodied significantly different elements. All three were touched by the oil boom of the early twentieth century, and each possessed population groups that stood apart culturally while simultaneously influencing the dominant “Anglo” element of the state. Oklahoma and Texas were settled, for the most part, by former residents of the older southern states, who had brought with them their values, traditions, and institutions. Louisiana, on the other hand, can be perceived as a land of at least three great cultures: a Roman Catholic, “Latin” culture in the South; an “Anglo,” Protestant culture in the north; and an African American culture whose influence could be felt throughout the state. Immigrants brought slaves and the cotton culture to all parts of the Southwest, making Texas and Louisiana parts of the southern economic and political orbit. They also transported their evangelical Protestantism to southwestern soil and brought with them many features of their folk heritage. Some of the old British ballads survived the westward migration, although they had lost many of their former characteristics. In some Texas communities, such as those found in the Big Thicket, a heavily forested area in the eastern part of the state, old ballads and old styles of singing endured well into the twentieth century. Many of the East Texas communities were, and remain, replicas of the older southern environment. And, in many of them, folk traditions died slowly.

Listen to music writer Will Hermes’ interview with Bill Malone and Tracey Laird on the Longreads Podcast (also available as a transcript).

Texas folk music, then, was basically southern derived. Texas rural musicians used instruments common to the rest of the South, sang in styles similar to those of other rural southerners, frequently attended house parties where old-time fiddlers held sway, and learned to read music at the shape-note singing schools. But despite its close cultural affiliation with the South, Texas had a culture all its own ─ a culture produced by the mingling of diverse ethnic strains: southern “Anglo,” black, German and Central European (especially prevalent throughout the southern part of the state), Mexican, and Louisiana Cajun (in the area extending from Beaumont to Houston). A passion for dancing was common among all these groups, and in this heterogeneous society, musical styles and songs flowed freely from one group to another, modifying the old southern rural styles. While rural music was prevalent and pervasive, it differed substantially from that produced in the Southeast or in the Deep South.

The discovery of oil at Spindletop, near Beaumont, in 1901 was the first of a series of finds in southeastern Texas, southwestern Louisiana, Oklahoma, and Arkansas in the years extending up through World War I. The discovery of the great East Texas oil field in the early 1930s, along with the rapid industrialization that began during World War II, further set Texas apart from the other southern states. While these factors contributed to Texas’s uniqueness, they are probably less important than the fact that it was also part of the West. In fact, to most Americans, Texas was and is the West. And this West was a glorious land peopled by cowboys.

The romantic concept of the West, shared by most Americans, has a history virtually as old as the nation itself. James Fenimore Cooper’s early novels describing the restorative qualities of the frontier were not substantially different, nor less romantic, than the themes emphasized later in Bret Harte’s stories, in the western “dime novels,” or in such books as Owen Wister’s The Virginian. Thus, the cowboy and the West had been bathed in romance long before Hollywood and the television industry began their exploitations of the theme. The American people also had long demonstrated a general interest in the songs of the cowboy ─ beginning with Nathan Howard Thorp’s Songs of the Cowboys, 1908, and John A. Lomax’s Cowboy Songs and Other Frontier Ballads, 1910 (as a matter of fact, as early as 1907, when “San Antonio” appeared, Tin Pan Alley tunesmiths had experimented with “cowboy” themes). Although a few concert-musicians such as Oscar Fox (from Burnet, Texas) and David Guion (from Ballinger, Texas) made classical arrangements of a few cowboy songs, the western theme did not make any significant impact on American music until the 1930s. Guion’s version of “Home on the Range,” first performed in 1930 in a New York play called “Prairie Echoes,” became the most popular arrangement of the song and was said, perhaps apocryphally, to be President Franklin Roosevelt’s favorite song. Such songs became so widely circulated in the 1930s that even Tin Pan Alley reverberated with the melodies of the range. The farther Americans became removed from the cowboy past, the more intense became their interest in cowboy songs and lore. Hillbilly singers and musicians did much to implant the romantic cowboy image in the minds of their American audiences.

Before the 1930s, a few musicians recorded songs that genuinely reflected the cowboy heritage. The concert singer Bentley Ball ─ who did many programs of patriotic and traditional songs, many of them in colleges ─ recorded “The Dying Cowboy” and “Jesse James” for Columbia in 1919. Charles Nabell, in November 1924, recorded some cowboy songs for Okeh, along with other types of traditional material. Several of the early cowboy singers came from Texas, and their songs, for the most part, reflected genuine cowboy experience. Carl Sprague, for example, may have done most to generate an immediate interest in the recorded songs of the cowboy. He grew up on a South Texas ranch near Alvin where he learned many of the songs (most of them from his cowboy uncle) that he later recorded for Victor. While Sprague was attending Texas A&M, Vernon Dalhart’s success as a singer of traditional songs convinced him that a similar market for cowboy singers might exist. He traveled to New York and had a successful audition with Victor Records; his earliest recordings had a sound very similar to that of Dalhart, including guitar and studio violin. His 1925 recordings of cowboy songs — topped off by the immensely popular “When the Work’s All Done This Fall” — mark him as one of America’s first singing cowboys. Singing, however, was never more than a hobby with Sprague, and aside from his recordings, he made few commercial appearances. For many years he was on the coaching staff at Texas A&M, and, in addition, he attained the rank of major in the United States Army.

The romantic concept of the West, shared by most Americans, has a history virtually as old as the nation itself.

Jules Verne Allen, on the other hand, had actually experienced the rugged life of a working cowboy before he embarked on his career as a radio singer. Born in Waxahachie, Texas, Allen began working cattle in Jack County, west of Fort Worth, at the age of ten. From 1893 to 1907 he worked as a rough string rider and bronco buster from the Rio Grande to the Montana line. Unlike Sprague, he used cowboy music as the basis for a professional career. During the 1920s and 1930s, Allen sang over numerous radio stations, including WOAI in San Antonio, where he performed as “Longhorn Luke.” Like most of the pioneer recording performers of the 1920s, Allen and Sprague drew most of their material from turn-of-the-century cowboy life, although some of their songs were learned directly from the Lomax collection.

Other cowboy singers of the early commercial period varied widely in the amount of actual range experience they possessed. The Cartwright Brothers (Bernard and Jack) grew up in Boerne, Texas, directly on the route of “the long drive” that proceeded on to Kansas. Essentially a fiddle band, the Cartwrights performed a variety of songs. Their version of “Texas Rangers,” however ─ marked by Bernard’s haunting fiddle ─ is one of the greatest performances of a cowboy song heard on early commercial records. Carmen William “Curley” Fletcher, from California, was a rodeo performer and itinerant hawker of songs long before he made any commercial recordings. His greatest claim to fame came through his writing in 1915 of the poem that became the basis for “The Strawberry Roan,” which he sold on broadside sheets. The song became one of the most popular western numbers, performed usually with a chorus added by the California radio singers Fred Howard and Nat Vincent. At least a couple of the pioneer cowboy singers, Goebel Reeves and Harry McClintock, were southerners whose wanderlust drew them west, where they worked at a wide variety of occupations. Both men, for example, spent some time in the famous radical labor union the Industrial Workers of the World (IWW, or Wobblies).

Our knowledge of the otherwise shadowy figure of Goebel Reeves comes from the pioneering research done by Fred Hoeptner. Known as “the Texas Drifter,” Reeves was born in Sherman, Texas, in 1899. Before his death in California in 1959, he had enjoyed a varied career that led him across the United States and around the world. Although he came from a respectable middle-class family (his father served in the Texas legislature), Reeves deliberately chose the life of a hobo. During the course of his wanderings, he enlisted in the army, saw front-line service in World War I, worked as a merchant seaman, became active in the IWW, toured the vaudeville circuit, performed on radio, and recorded under several names for such companies as Okeh and Brunswick. In his recording career as a singer and yodeler ─ he claimed to have taught Jimmie Rodgers the yodeling style in the early 1920s while living in New Orleans ─ Reeves introduced some of the most interesting examples of both cowboy and hobo songs found in American music. These included the well-known “Hobo’s Lullaby” (which he claimed to have written), “The Hobo and the Cop,” “Railroad Boomer,” and the cowboy songs “Bright Sherman Valley” and “The Cowboy’s Prayer.”

Harry McClintock was as well traveled as Reeves, having also been a merchant seaman, a soldier, and a hobo. Born in Knoxville, Tennessee, he roamed widely throughout the United States and became a member of the IWW in the early twentieth century. Because of his musical talents, McClintock was a welcome addition to the Wobblies, who had a well-known fondness for singing and whose Little Red Songbook became virtually the bible for labor/protest singers in America. McClintock’s claim that he wrote “Hallelujah, I’m a Bum” and “Big Rock Candy Mountain,” two of the world’s most famous hobo songs, has never been seriously challenged. Once he settled down from his wanderings, McClintock began a career as a radio cowboy singer as early as 1925 on KFRC in San Francisco. “Haywire Mac,” as he was often called, also recorded for Victor from 1927 to 1931. Along with superbly performed cowboy songs such as “Sam Bass,” “Jesse James,” and “Texas Rangers,” McClintock’s labor songs make him one of the important progenitors of western music.

John White and Otto Gray contributed to the shaping of western music by presenting it widely to a national audience. White was an unlikely “westerner,” hailing from Washington, DC. However, he was the first person to introduce cowboy songs on radio to a New York audience (on NBC from 1927 to 1936). He also recorded cowboy songs, as well as hillbilly material, from 1929 to 1931, under several pseudonyms, including “the Lonesome Cowboy.” White specialized in the history of cowboy songs, and over the years he did more than any other person to describe the origins of the ballads and to dispel much of the romantic claptrap that had gathered around them.

Otto Gray, a prosperous rancher from near Stillwater, Oklahoma, pioneered in the commercialization of cowboy music. In about 1923, he assumed the leadership of a string band that earlier had been composed of real cowboys ─ the McGinty Cowboys (named for Billy McGinty, an Oklahoma rodeo performer). Gray’s group had the distinction of being one of the few country groups publicized in Billboard, although Gray paid for most of the advertising. From 1928 to 1932, Gray and his Oklahoma Cowboys made a tour of radio stations throughout the country and performed on the northeastern RKO vaudeville circuit. Momie Gray (Otto’s wife) was the featured singer of the organization, specializing in sentimental songs. The Oklahoma Cowboys were a highly professional group that possessed most of the characteristics of slick show-business organizations. A special publicity man traveled in advance of the group, and appearances on radio stations provided further exposure. Two agencies, the Weber-Simon Agency in New York and the William Jacobs Agency in Chicago, handled the group’s RKO bookings. The Gray performers, dressed in plain, western-style clothing, traveled in Gray’s $20,000 custom-built automobile, which was wired for sound reproduction and had a radio receiver and transmitter.

If Otto Gray contributed significantly to the commercialization of “western” music, Jimmie Rodgers played an equally important role in fusing it with country music. As discussed earlier, Rodgers spent the last few years of his life in Texas and conducted many of his most successful tours there. He took great pride in the Texas heritage and the romantic cowboy past. The modern concepts of the “singing cowboy” and of “western” music may very well date back directly to Rodgers.

Scores of singers who modeled themselves after Jimmie Rodgers emerged in the 1930s, and most of them gave themselves “cowboy” titles and dressed in western attire. Young Hank Snow, for example, in far-off Nova Scotia, dressed in cowboy regalia and called himself “the Yodeling Ranger.” In even more remote Australia, Robert William Lane performed under the name of Tex Morton, described himself as “the Boundary Rider,” and sang cowboy songs with a bizarre, trilling yodel about both the Australian bush and the Texas Plains. Others, like Ernest Tubb, included few cowboy songs in their repertories but wore cowboy boots and ten-gallon hats. Since the western attraction was irresistible, even young hillbilly singers from the Deep South or from the southeastern mountains, whose associations with cowboys came only through story and song, embraced the western image and imagined themselves “way out west in Texas for the roundup in the spring.”

Perhaps because of Rodgers’s close association with Texas, many of the successful Texas hillbilly performers ─ Ernest Tubb, Lefty Frizzell, Floyd Tillman, Bob Wills, Tommy Duncan ─ credited Jimmie Rodgers as their inspiration. One of the most important of these individuals, and the one who completed the “romantic westernizing” process begun by Rodgers, was Orvon Gene Autry. Autry owed most of his initial success to the fact that he could perform Rodgers’s repertory in Rodgers’s yodeling style. Autry was born on a horse farm near Tioga, Texas, on September 29, 1907, but moved to Oklahoma with his parents while in his teens. Although his father was a horse trader, Gene experienced little of the cattle ranch life that his promotional material later stressed. At any rate, he left the “ranching” life as quickly as he could, working as a railroad telegrapher and singing at every opportunity.

According to a much-repeated story, confirmed by Autry himself, Will Rogers inspired his decision to become a professional musician. One day in 1927 the great humorist came to Chelsea, Oklahoma, where Autry was working as a telegrapher for the St. Louis and Frisco Railroad, heard the young man singing and strumming his guitar, and strongly encouraged him to go to New York and become a professional. Autry’s first trip to the big city in 1927 was unsuccessful, but he returned to Tulsa and got a job on KVOO as “the Oklahoma Yodeling Cowboy.” Returning to New York in 1929, he made his first records for Victor, accompanied by the Marvin Brothers, Johnny and Frankie. In December of the same year, Autry began a crucial association with Arthur Satherley, who recorded him for the American Record Company (ARC), producer of records for chain stores and for Sears. It was through the association with the Sears Conqueror label that Autry made it to WLS and the National Barn Dance.

In Chicago after 1931, Autry was an immediate success. His appearances on the Barn Dance and on his own radio program, Conqueror Record Time, made him one of the most popular performers in WLS history. His records, released on Sears labels, were those most prominently displayed in the Sears-Roebuck catalogue. As a result of his growing popularity, a number of Gene Autry songbooks and guitar instruction books began to appear in the early 1930s. An ad for a Gene Autry “Roundup” Guitar, priced at $9.95, reminded the reader that Autry had become a famous performer “simply because he learned to play a guitar while on the ranch.” Autry’s promotional mentors, Art Satherley and Ann Williams of the WLS production staff, capitalized on the “western” motif and advertised him as a singing cowboy long before the bulk of his recorded repertory came to include western numbers.

With Autry ensconced as a singing movie cowboy, hillbilly music now had a new medium through which to popularize itself.

In his early years as a professional singer, and on through the WLS period from 1931 to 1934, Autry remained a hillbilly singer, only rarely singing anything of a western variety. In both song selection and in style of performance, he revealed his indebtedness to the southern rural tradition. His Jimmie Rodgers imitations were among the best in country music, and his own “compositions” (written or cowritten with people like Jimmie Long) included such songs as “A Gangster’s Warning,” “A Hillbilly Wedding in June,” “Gosh, I Miss You All the Time,” and “My Old Pal of Yesterday.” In 1931, he recorded one of the biggest-selling hits in hillbilly music’s then-short history, “That Silver Haired Daddy of Mine,” recorded as a duet with the song’s co-composer, Jimmie Long. Autry’s many and varied recorded selections even included at least one labor song: “The Death of Mother Jones,” recorded on at least seven labels, which applauded the life of the famous and radical labor leader. While the song seemed rather remote from the type one would expect from a cowboy singer, it nevertheless reflected the passion for social and economic justice that many people felt during these Depression years.

Autry’s success on the Chicago radio stations and on record labels gained him in 1934 the position that made him the best-known cowboy in the United States and one of the most famous hillbilly singers. In that year, he arrived in Hollywood and began his career as the “Nation’s Number One Singing Cowboy.” Beginning with a small part in Ken Maynard’s In Old Santa Fe, he then starred for thirteen episodes in a strange cowboy/science-fiction serial called The Phantom Empire. Autry went on to a featured role in 1935 in Tumbling Tumbleweeds, a film that also included his old sidekick from Chicago days, Lester Alvin “Smiley” Burnette. In the following decades, he made more than ninety movies for Republic, Columbia, and Mascot, eighty-one of which included the multitalented Burnette, who usually played a bumbling character, Frog Millhouse. While becoming one of the most popular and wealthy actors in Hollywood, Autry also created the stereotype of the heroic cowboy who was equally adept with gun and guitar. Autry was not the first individual to sing in a western movie ─ Ken Maynard had done so as early as 1930 ─ but he was the first to institutionalize the phenomenon. With Autry ensconced as a singing movie cowboy, hillbilly music now had a new medium through which to popularize itself. The silver screen further romanticized the cowboy and helped shape the public idea of western music.

After signing his Hollywood contract, Autry made a radical shift in his repertory from “country” themes to “western” motifs. Instead of singing songs about the mountains, he came increasingly to perform songs with such titles as “Ridin’ Down the Canyon,” “The Round-up in Cheyenne,” and “Empty Cot in the Bunkhouse.” Both in Autry’s singing and in the instrumentation that accompanied him, one hears a distinctly measurable change in the records he made from 1929 to 1939. As the one-time hillbilly singer reached out to a larger audience, he smoothed out his presentation of material with a lower vocal pitch, well-rounded tones, and honey-coated articulation. Instrumentally, Autry’s sound exhibited a similar evolution, particularly after the violinist Carl Cotner became his musical director. Soft guitars, muted violins, a melodious but unobtrusive steel guitar, an accordion, and occasionally even horns could be heard as background instrumentation, as he and his directors sought a sound that would give no offense to America’s broad urban middle class. Whatever vocal sound was featured, however, Autry demonstrated a mastery of it. No country singer has ever shown more versatility.

Autry’s popularity inspired other movie companies to present their own versions of the singing cowboy. In searching for likely candidates, the companies usually delved into the ranks of country music, acquiring acts that had already established themselves on hillbilly radio shows or on record labels. Following Smiley Burnette, the Light Crust Doughboys became the first country group to join Autry in a movie (Oh, Susanna!). Some Autry sidemen went on to become important entertainment personalities in their own right. Johnny Bond, Jimmy Wakely, and Dick Reinhart, for example, came to Hollywood in 1940 (as the Jimmy Wakely Trio) and joined Autry’s Melody Ranch radio show in September of that year. Reinhart became one of the early exponents of the honky-tonk style, with songs like “Fort Worth Jail” and “Truck Driver’s Coffee Stop.” Wakely eventually starred in many movies of his own, became one of country music’s smoothest singers, and made several seminal recordings, such as “One Has My Name (The Other Has My Heart)” (one of the first successful “cheating” songs in country music). Bond remained on the Melody Ranch program until it ended in 1956, playing the role of a comic sidekick and opening the show each Sunday with the bass guitar run introduction to “Back in the Saddle Again.” Bond also became one of country music’s greatest songwriters, creating such songs as “Cimarron” (a song about a small river in Oklahoma, and performed by all western groups), “I’ll Step Aside,” “Old Love Letters,” and “I Wonder Where You Are Tonight” (now a standard in both bluegrass and mainstream country music).

A long line of hillbilly singers made only occasional appearances in western movies, usually as supporting actors for such leading cowboy stars as Charles Starrett and Johnny Mack Brown. The Sons of the Pioneers appeared in numerous movies, while Bob Wills and his Texas Playboys were in about eight. A few singers, such as Ernest Tubb, Jimmie Davis, and Bill Callahan, made only rare appearances.

Other singers, however, became leading men and posed at least modest challenges to Autry’s dominance. Atlanta-born Ray Whitley, the writer of “Back in the Saddle Again” and the designer of one of country music’s most popular guitars, the Gibson SJ-200, became a movie star in 1936 after an earlier successful career in New York as a cowboy singer. Tex Ritter also began his movie career in 1936, and, in the fifty-six movies that he eventually made, he became the most believable of all the singing cowboys. The most successful challenge to Autry, though, came from Roy Rogers, who signed with Republic in 1937. His visibility in American public life would last, because of television, well into the 1960s. The singing cowboy genre also persisted in American movies on into the 1950s, with Arizona-born Rex Allen being its chief exponent after 1949. In many ways, this last singing cowboy was the best singer of them all. Allen’s rich voice ranged from a deep bass to a sweeping tenor — a sound that almost no other country singer could equal.

Largely as a result of Hollywood exploitation, the concept of “western music” became fixed in the public mind. After the heyday of Gene Autry, the term “western” came to be applied even to southern rural music by an increasing number of people, especially by those who were ashamed to use the pejorative term “hillbilly.” Not only did the public accept the projection, but even most hillbilly singers became fascinated with the western image and eventually came to believe their own symbols. Autry was the first of a long line of country singers who clothed themselves in tailored cowboy attire; in the following decades, the costuming became increasingly elaborate and gaudy, with the brightly colored, bespangled, and rhinestone-laden uniforms created by Nudie the Tailor (Nudie Cohn, born Nuta Kotlyarenko in the Ukraine in 1902) in Los Angeles being the most favored fare. Eventually, most country performers, whether they hailed from Virginia or Mississippi, adopted cowboy regalia — usually of the gaudy, dude cowboy variety.



Along with the clothing, country bands and singers — particularly in the Southwest and on the West Coast — adopted cowboy titles. Singers with names like Tex, Slim, Hank, Red River Dave, the Utah Cowboy, and Patsy Montana, and groups with such titles as the Cowboy Ramblers, Riders of the Purple Sage, Radio Cowboys, Swift Jewel Cowboys, Lone Star Cowboys, and Girls of the Golden West (Dolly and Millie Good) abounded on radio stations (and record labels) all over the nation. Radio and record promoters, of course, were very much alive to the appeal of the western myth, and they often encouraged musicians to adopt appropriate western monikers. Millie and Dolly Good, for example, were farm girls from Illinois who sang and yodeled in sweet, close harmony. Their agent advised them to dress like cowgirls, gave them the romantic title Girls of the Golden West, and then, after scanning the map of western Texas, attached to their promotional literature the statement that they were born in Muleshoe, Texas. The Girls very carefully preserved this fiction to the end of their performing career.

Patsy Montana’s career was similarly shaped by romantic conceptions of the West. She was a singer and a fiddler from Arkansas named Rubye Blevins, but on the West Coast in the early 1930s, Stuart Hamblen renamed her Patsy Montana, and she thereafter cultivated the performing image of the cowgirl. Although much of her career saw her appearing as a “girl singer” with such groups as the Prairie Ramblers, Patsy made dramatic history in 1935 when “I Want to Be a Cowboy’s Sweetheart” became the first huge hit by a woman country singer and a virtuoso yodeling piece that still influences the style of women singers (Austin country-rock singer Marcia Ball, for example, made the song and yodel standard parts of her repertory in the late 1970s).

Many of the “western” entertainers performed cowboy songs, usually highly romanticized, but more often their titles and attire were the only ties they had with the “West.” Several musicians, however, stayed rather close to the cowboy repertory. Some of them had been performing long before Gene Autry achieved Hollywood fame, and many of them, such as “Haywire Mac” McClintock and the Crockett Family (John H. “Dad” Crockett and his five sons, originally from West Virginia), had performed on California radio stations since at least 1925. Other early California groups included Len Nash and his Original Country Boys, broadcasting from KFWB, Hollywood, as early as March 1926; Sheriff Loyal Underwood’s Arizona Wranglers; Charlie Marshall and his Mavericks; and perhaps the most important (and certainly the most interesting), the Beverly Hillbillies.

The Beverly Hillbillies were the brainchild of Glen Rice, station manager at KMPC in Los Angeles. Reversing the trend toward adoption of western names during the 1930s, Rice used the eastern moniker Hillbillies for the group of western musicians that he assembled around the accordion player Leo Mannes (renamed Zeke Manners) and conducted a ballyhoo campaign alleging that a group of strange and primitive musicians had been unearthed in the hills of Beverly. The band made its debut on KMPC on April 6, 1930, and remained a popular feature throughout the decade. Over the years the Hillbillies included several fine musicians, such as Manners, who had no background in country music but had been attracted to California because of the lure of Hollywood. A few Hillbillies were genuine country boys, such as the sky-high yodeler Elton Britt (James Britt Baker), who came from Arkansas in 1930, and Stuart Hamblen, who came from Texas in the same year. Britt went on to become one of country music’s most gifted yodelers (virtually the last of that once-hardy breed) and a leading soloist during the 1940s. Hamblen, the son of a Methodist minister in Abilene, Texas, was a fixture on West Coast radio from 1930 to the 1950s. He hosted his own shows in Hollywood after 1931, boosted the careers of other performers, wrote many of the most successful songs of the decade (including “My Mary,” “Texas Plains,” “Golden River,” and “My Brown-Eyed Texas Rose”), was the first country performer signed by Decca in 1934, and became sufficiently known to become a candidate for Congress in 1938.

The western group that ultimately became the most famous, and the most frequently emulated, was the Sons of the Pioneers. They sang virtually every type of country song and even ventured into popular music, but the majority of their melodies dealt with western themes. Perhaps more than any other group, they preserved a western repertory and exploited the romantic cowboy image. More “western” stylistically than any other group, they were among the least western in terms of origin. Bob Nolan (Robert Clarence Nobles) was born in New Brunswick, Canada, but he moved with his parents to Tucson at the age of fourteen. In Tucson he found himself fascinated with the desert, a feeling that never left him and eventually inspired some of country music’s greatest songs, such as “Cool Water,” “Tumbling Tumbleweeds,” and “At the Rainbow’s End.” Tim Spencer, also an outstanding songwriter, was born in Missouri but grew up in Oklahoma, Texas, and New Mexico. Roy Rogers came from southern Ohio.

The three musicians came to California in the early 1930s and soon fell into a pattern common to most country singers during the decade, moving from group to group before they formed their own organization. Roy Rogers, the prime organizer of the trio, was born Leonard Slye in Cincinnati, on November 5, 1911, but grew up on a small farm near Portsmouth, in southern Ohio. Here he garnered his earliest musical training from his Kentucky-born mother and his mandolin-and-guitar-playing father. In 1931 he and his father moved to Tulare, California, and worked as migratory fruit pickers. In the following three years, beginning with a duo called the Slye Brothers (Leonard and a cousin), he worked with several western-style groups until the Pioneer Trio was formed in 1933. Renamed the Sons of the Pioneers the following year, the trio soon became noted for their smooth, inventive harmonies and yodeling, and for the finely crafted songs that Nolan and Spencer created. They became so famous for their harmony that their instrumental accompaniment is often forgotten. Two extraordinarily talented brothers from Llano, Texas, Hugh and Karl Farr, joined them in 1934 and 1935. The Farrs were jazz-influenced country musicians whose progressive styles were sometimes obscured by the vocal emphasis of the Pioneers. Hugh Farr, who also sang a low-down bass with the group, was one of the hottest fiddlers of the period, and his brother, Karl, was a master of both the rhythm and single-string styles of guitar.

The Pioneers won extensive popularity on the West Coast with an early-morning radio program on KFWB in Hollywood, but 1936 proved to be their banner year. By this time their radio transcriptions were being widely circulated, and the group became a featured act, along with Will Rogers, at the Texas Centennial in Dallas. Leonard Slye left the group in 1937 after signing a movie contract with Republic Studios. At this point he changed his name, first to Dick Weston, and later to Roy Rogers. His performances after this time were made on an individual basis, and he eventually rivaled Gene Autry as America’s most popular singing cowboy (Rogers was also one of country music’s finest yodelers). He was replaced in the Sons of the Pioneers by Lloyd Perryman from Ruth, Arkansas, whose natural tenor was the first the group had ever had, and who gave them an even closer harmony than they had earlier possessed. The Sons of the Pioneers underwent numerous personnel changes after 1937 but have never disbanded. Their songs moved into the repertories of country singers everywhere, and their style of harmony was widely copied, most effectively by Foy Willing (originally Willingham) and the Riders of the Purple Sage, who appeared with Monte Hale and Roy Rogers in Republic Pictures from 1942 to 1952.

The flourishing singing cowboy industry inspired the emergence of songwriters, including two of country music’s finest — Fred Rose and Cindy Walker — who made their debuts as country composers in the 1940s when they wrote songs for movies (Rose for Autry, Walker for Bob Wills). The interest in western music in the 1930s, however, was not confined to country performers and their supporters. Writers from Tin Pan Alley also reacted to the western craze, and the entire nation was soon humming western-style tunes such as “Gold Mine in the Sky,” “There’s a Home in Wyoming,” and “I’m an Old Cowhand.” Some of these tunes were written by easterners who had never been near a cow, but the Happy Chappies at least lived in California in the midst of the Hollywood industry. The Chappies were a pop-singing duo named Nat Vincent and Fred Howard who wrote or arranged such songs as “When the Bloom Is on the Sage,” “Mellow Mountain Moon,” “My Pretty Quadroon,” and “Strawberry Roan” (the last a musical adaptation of Curley Fletcher’s earlier poem). The most successful of the western-oriented popular songwriters was a Bostonian, William J. (Billy) Hill. Hill’s birth and musical training gave no indication of his future success as a western songwriter. Born in Boston in 1899, he studied violin at the New England Conservatory of Music and performed for a short time with the Boston Symphony Orchestra. In 1916 he traveled west, riding the rails and working at odd jobs until he had seen most of the western states. He returned to New York in the late 1920s after becoming thoroughly acquainted with western life — including everything from camp cooking to cowpunching. In New York he worked as a doorman at a fashionable hotel and composed songs occasionally.
Over the years his compositions ranged from popular melodies like “The Glory of Love” to hillbilly songs like “They Cut Down the Old Pine Tree” and “The Old Spinning Wheel.” His chief success, however, came with western-style songs like “Call of the Canyon,” which were distinguished for their beautiful melodies and for rhythms that suggested the gait of a horse. He experienced his most spectacular success in 1933 with “The Last Roundup,” the song that really awakened the general public to the romantic West while becoming the most popular tune in the country. Performed by both hillbilly and popular groups, the song may have stimulated a greater interest in the more “authentic” country and western material and ensured a greater national following for country music.

Most of the western bands in California and the Southwest used Billy Hill’s material, but his New York songwriting ventures were directed primarily at big-city popular-music audiences. Although country music has always encountered its coolest reception in the Northeast, particularly in the city of New York, country-style entertainers have always achieved some prominence there on local radio stations. Ethel Park Richardson, for example, did much to educate New Yorkers about the beauties of folk culture between 1933 and 1935 with her weekly dramatizations on WOR and the NBC Network. Each week she was assisted by such singers as Frank Luther, Carson Robison, and Tex Ritter as she dramatized a famous folk song. Luther and Robison had been in New York since the 1920s, but Ritter was one of several cowboy singers who kept New Yorkers range conscious during the mid-1930s. Others included Texas Jim Robertson, a deep-bass singer from Batesville, Texas; Zeke Manners and Elton Britt, who had moved from California; Dwight Butcher, a Jimmie Rodgers disciple from Tennessee; Ray Whitley, who sang regularly at the Stork Club and on WMCA; and Wilf Carter, the Nova Scotia yodeler who performed over CBS as Montana Slim.

The most singular of all the cowboy singers in New York, however, was Woodward Maurice “Tex” Ritter. Born in Murvaul, in deep East Texas, January 12, 1905, Ritter grew up far removed from the scene of much cowboy activity. He attended the University of Texas for five years (singing in the university glee club under the direction of Oscar Fox) and then went to Northwestern Law School for one year. Throughout his youth he had collected western and mountain songs, and therefore had a storehouse of interesting songs when he began singing on KPRC in Houston in 1929. In 1930, he joined a musical troupe on a series of one-night stands through the South and Midwest. By 1931, he had gone to New York, where he joined the Theatre Guild and began his acting career with a featured role in Green Grow the Lilacs (a short-lived play that eventually became the basis for the musical Oklahoma). With his thick Texas accent and storehouse of cowboy lore, Ritter quickly emerged as a New York sensation. He became greatly in demand for lecture recitals in eastern colleges on the cowboy and his song. During the fall of 1932, he was the featured singer with the Madison Square Garden Rodeo and from there went on to a recording contract with ARC and a program slot on WOR entitled The Lone Star Rangers, one of the first western radio shows ever featured in New York City. From 1932 to 1936, he appeared on other New York stations, including the WHN Barn Dance, where he acted as cohost with Ray Whitley. Then, inevitably, in 1936, he made the first of several movies, Song of the Gringo. Ritter, however, was not a cowboy, but was instead a very believable interpreter of cowboy songs. Impressionable easterners were easily convinced that he came, not from a small East Texas community and a college background, but from a working cattle ranch. And Tex very skillfully lived up to the part.

Tex Ritter’s exploitation of the western theme was typical of what was happening all over the United States in the mid-1930s. From New York to California, individuals responded to the western myth, and “cowboy” singers and groups sprang up in all sorts of unusual places. “Western” became a rival and often preferred term to “hillbilly” as a proper appellation for country music. It is easy to understand, of course, why “western” would be preferred to the seemingly disreputable backwoods term. “Western,” specifically, suggested a music that had been developed by cowboys out on the Texas Plains or in the High Sierras; more generally, it suggested a context that was open, free, and expansive. In short, the term fit the American self-concept.

***

Listen to music writer Will Hermes’ interview with Bill Malone and Tracey Laird on the Longreads Podcast, also available as a transcript.

Excerpted from Country Music USA. Copyright ©1968 by the American Folklore Society. Copyright © 1985, 2002, 2010, 2018 by the University of Texas Press. All rights reserved.

Smooth Spaces, Fuzzy Lives

Brian Lawless/PA Wire

Rachel Andrews | Brick | Summer 2018 | 18 minutes (4,831 words)

A photograph in an Irish newspaper depicts a member of the Garda Síochána shaking hands with his counterpart from the Police Service of Northern Ireland at one of the points where the territory of the Republic turns into that of Northern Ireland. The photograph, published in November 2015, seven months before Britain voted to exit the European Union, accompanies an article on plans for a “border corridor,” whereby police on both sides of the border can pursue fleeing criminals into each other’s region.

There’s a kind of joviality to the photograph: firm clasping of hands, big smiles. Behind the two men is the Irish landscape, rolling, misted, a river cutting through fields of green. The officers wear different uniforms, but the only obvious territorial demarcation, the only hint that they inhabit different countries, with different laws, health systems, and currencies, is a sharp change in road color, from black to sudden grey.

I remember this non-distinctiveness, the dawning awareness that I had crossed a boundary, from the many trips I took to Northern Ireland between 2007 and 2010, when I worked on an essay that documented the systematic demolition of the Maze prison, a story that presented itself symbolically and — as it turned out — all too simplistically as one of a settling of the past and a coming together for the future.

I never went North as a child. I remember a drawing in a newspaper depicting a map of Ireland. In the sliver of space that is Northern Ireland, the cartoonist had penned: “there be dragons.” In truth, it was worse than that. Ask me as a 6-year-old, a 12-year-old, about Northern Ireland and I would have responded: bombs and blood. Ask my young daughter today and she might look at you blankly. It means nothing to her, and that is a good thing.

There were ways it meant nothing to us too. I grew up in Cork, in the very south of Ireland, and that meant growing up a world away from bombs and blood. As children in the 1970s and 1980s, we were safe from soldiers in the back gardens, from streets we couldn’t walk down. But things filtered into our child worlds. From television: the dark loom of the watchtower, the helicopters, the aerial prison shots following the 1983 Maze escape; Gordon Wilson, who lost his 20-year-old daughter in the 1987 Enniskillen bombing: “I shall pray for those people tonight and every night.” Of the few discussions in school, I remember one: the classmate who had relatives in Belfast, and her upset, her anger, at our fear, our distancing and distaste.

As I got older and traveled in Europe, the easy comfort of that distancing — you and I are not alike — was undercut. “So where did you hide the bomb?” a French colleague joked when I worked for a summer at a hotel in Munich. “Until I met you, I thought all Irish people were savages,” a German girl told me during my Erasmus year in France. This was the early to mid-90s and everywhere I went, there it was. “We in Australia just can’t understand it,” said the visitor to my apartment in London. I still remember the insult of his bemusement and sincerity, as well as my own avoidance. As far as everyone on the outside was concerned, I was them and they were me. I knew better — I mean, was it not obvious?

The first time I went North was in 2000, two years after the Good Friday Agreement, after the Omagh bombing and the howl of anguish that went with it, after it became imaginable, almost normal, for me to drive in my tiny Southern-registered Fiat from Dublin to Belfast and back again as I researched a writing project on women working in politics in Northern Ireland. I was in my late 20s by then, and I wasn’t afraid in Belfast’s city center, which had the same familiar department store names as any British or Irish main street; nor on the Falls Road, which wrapped me in a warm blanket of tri-colors and Celtic symbols; but I felt heavy and intimidated as I made my way up the red, white, and blue pavements on Shankill Road to the offices of the Progressive Unionist Party, hesitant to speak in the corner store lest I betray myself through the soft spill of my Southern tones. But then this dissolved too, and seven years later, when I spent time interviewing former prison officers at the Maze, as well as the residents who had grown up beside the prison, all from a Unionist background, the sing of my Cork accent felt more like a benign curiosity than anything traitorous or threatening.

“Merging is dangerous,” writes Rebecca Solnit, “at least to the boundaries and definition of the self.” Is that why we wrestle against it so? The border with Northern Ireland, once a site of blocked roads and lookout towers, has evaporated, at least on the surface of things, but it remains a place of struggle, of contest, a tussle between those who wish to take it one way and those who would move in another direction, either within the boundaries of a unified Ireland or into the space of clarity that tells us where we end and they begin. The amorphous situation that has existed along the border for nearly twenty years, a fudge that has resisted discrete categorizations and that we seemed to have found a way to live with, or live within, is under pressure in the wake of the Brexit vote, as the clamor to once again define what we are and what we are not begins to accelerate. We look for the solace of certainty, of knowing if we are one thing or the other, rather than allow ourselves to remain within the complicated, messy space of the both/and, a state made possible by the exhaustion left after thirty years of violence.

Hannah Arendt had a particular view of merging. As she searched out a meaningful concept of a Jew’s place in the world following the sundering caused by the Second World War, she ultimately rejected a form of Zionism that connected citizenship to ethnicity and tethered both to the boundaries of the nation-state. On the other hand, she wrote scathingly of those European Jews who would assimilate, who would ape the Gentiles in an effort to find their way into the ranks of the human, who would, she wrote in disgust, become “good Frenchmen in France,” “good Germans in Germany.” Arendt, you could say, had been one such good German. As a child she did not know that she or her family were Jewish; she learned of her ethnicity only through the anti-Semitic taunts of children on the street. But it was the shocking stripping of her German citizenship as an adult in the 1930s that ultimately woke her to the helpless vulnerability of the assimilated Jew and formed her conviction that Jews must stand defiantly aloof from the boundaries of nationality, turning instead toward the belonging of the citizen; the belonging that attaches to full and complete membership of a political community; the belonging that confers the right to meaningfully speak, act, and be heard in such a community; the belonging that means inhabiting a territory without subscribing to an overarching identity narrative. “Refugees driven from country to country represent the vanguard of their peoples — if they keep their identity,” she wrote. Today her sentiments do not appear so different from those of Dina Nayeri, an Iranian refugee who received U.S. citizenship at 15 and became a French national at 30, and who wrote in the Guardian that she had lost interest in the need to rub out her face as tribute for these benefactions. “As refugees, we owed them our previous identity. We had to lay it at their door like an offering, and gleefully deny it to earn our place in this new country. 
There would be no straddling. No third culture here,” she said, although a third culture appears to be the choice made by Arendt’s beloved Heinrich Heine, at least as she described it, which was to live as both a German and a Jew rather than deny his Jewishness as the price of belonging. “He simply ignored the condition which had characterized emancipation everywhere in Europe — namely, that the Jew might only become a man when he ceased to be a Jew,” she wrote.

Arendt came out of a Europe that had, she witnessed, conclusively intertwined national rights with human rights, which left her as mistrustful of a national, bordered identity as she was of the “abstraction” of any solemn notions of the inalienable rights of man. Heine, the Prussian-born poet and literary critic, came of age in the early nineteenth century, an era of political instability and contentiousness in his homeland; his conversion to Lutheranism was reluctant, regretted, and carried out only as the price of “admission into European culture.” In the early years of the twenty-first century, there was a feeling — I had the feeling — that Europe, at least on some parts of its continent, had found its way beyond these aspects of its shattering history and was on the turn toward the global and the flexible. In 2002, when I lived for a time in Paris, I could board a plane in France and emerge in Italy, where I could retrieve my bags and leave the airport without showing any identification, without queues or questions. This identity-less travel, the result of the then seven-year-old Schengen Agreement and so opposite to my conditioned, normalized experience of waiting in dutiful lines, gave me the very real sense of being a human in the world. The continent of Europe — the part of it that now had a common currency and permeable frontiers, and even onwards toward the rapidly opening East — felt magical, enlightened even, as if we were all in this together. The distinctions between us, forged through cultural, religious, and geographic experience, appeared shapeless now. I could be both Irish and European; I felt that I could, as Arendt wrote, “speak the language of a free man and sing the songs of a natural one.”

But there was a “them.” From my window in Stalingrad, the quartier in the north of Paris where I lived from September to Christmas, I watched men in jeans and jackets congregate outside in the early darkness of the winter evenings, lining up in huddled rows on a Friday for weekly prayers. I looked on, curious — what are they doing? — before I understood. This was one year before the Iraq War, which fractured the Arab world, but already and for long years it was not easy to be Muslim in France, even if you were the French-born descendant of those who had come in the 1950s and 1960s as part of the first wave of migrant workers from northern Africa who stayed in search of a better life; even if you identified as both Muslim and French, as really all, or at least so many, of such descendants do, and as French civic society, with its emphasis on the primacy of the citoyen, encourages — in theory anyway. The exclus, they used to call them. The excluded. If I lived in Stalingrad today, the men across the street would no longer be there; in 2011, politicians banned the saying of street prayers in Paris following far-right protests about creeping Islamization. Instead, near the street I lived on, under the bridge where the metro station lay, there would almost certainly be tents and other makeshift shelters constructed by refugees from Iraq, Syria, Afghanistan, and elsewhere, part of a new and different wave of migration that, along with the 2008 economic crisis, has upended all of Europe. In 2002, I also went to Greece on a reporting assignment. There was no graffiti then comparing Angela Merkel to Hitler; today many in the desperately indebted country view the dominance of German capital as the source of their woes. In Italy, France, Germany, a radicalized electorate now supports nationalist parties, looks at the European Union with deep suspicion. We were never all in this together.

For the moment, I can travel from Ireland to Britain without a passport. For the moment, I can drive from Dublin to Belfast without stopping, as the road melts from the N1 to the A1 and the white and black sign informs me that speed limits are now being monitored in miles rather than kilometers. (How different to John McGahern’s experience in the early 1990s, recounted in the essay “County Leitrim: The Sky Above Us”: “There are ramps and screens and barriers and a tall camouflaged lookout tower,” he said of the border crossing at Swanlinbar in County Cavan. “A line of cars waits behind a red light. A quick change to green signals each single car forward. In the middle of this maze armed soldiers call out the motor vehicle number and driver’s identification to a hidden computer. Only when they are cleared will the cars be waved through. Suspect vehicles are searched. The soldiers are young and courteous and look very professional.”) By the time I have finished writing this article, British Prime Minister Theresa May will have triggered Article 50, and the movements I have become used to taking between cities and countries will have been thrown into confusion. Since the terrorist attacks of November 2015, France has been in a state of emergency that includes a firm policing of its borders. For more than a year and a half, commuters traveling from Malmö in Sweden to Copenhagen in Denmark had to present their IDs. Temporary border controls have also been introduced by Germany, Austria, and Norway. Merging is dangerous. Those hoping for a united Ireland — and I am surely one — forget this.
On his blog, the journalist and Northern analyst Andy Pollak notes that Andrew Crawford, the former special adviser to current Democratic Unionist Party (DUP) leader Arlene Foster, used to go through reports from one North-South body removing the phrase “all-Ireland.” Perhaps the action of deletion helped Crawford forge certainty, was part of an attempt to make sense of how he existed in time and space. Forging certainty helps us all as we construct both story and identity in order to figure out how to live, but certainty, or at least a fixed destination, gets us into trouble too: we blind ourselves to possibilities, to the creative potential that lies outside of the either/or, to what can happen when we follow Arendt, say, or Deleuze, that great demolisher of dualisms, into the space of the non-being, the uncertain, the becoming.

In her photographic series Kinderwunsch, Ana Casas Broda depicts her body in thrall to those of her children, an artist willing to lose herself in conversation with flux, with change, with overwhelm. The photos are intimate and direct. Casas Broda often stares unsmilingly at the camera: a candid, life-worn Olympia, her pregnant body naked and big, uncomfortable-looking with her second child, or scarred and slack following fertility treatment and birth. In one of the images, her children have marked her face and torso with crayon; she both encouraged this and passively accepted the results. “I am their canvas: they play with me and change me,” she said in an interview. Kinderwunsch means “desire to have children,” and Casas Broda submits, it appears to me, to the terror and the unknown of that primal desire. She tumbles downwards, inwards. In the photographs, her children clothe her in tissue paper, they cover her in Play-Doh. “I see their scribbles on my body as a symbol of how motherhood has changed me,” she said. What she is really depicting is dissolution (of a former self), symbiosis — and something else. In some of the images, she and her children appear as one, interwoven, but there are others where she is alone, or they are indifferent to her: a son plays a video game as she lies naked on a couch, in between mother and person, neither here nor there, her body nonetheless relaxed, strangely at ease in the moment.

Around the time I began my Maze project, I was experiencing the greatest disintegration of self I had ever felt. Crossing the border from North to South represented moments of enormous exhilaration and giddy freedom: dazed as I was, when I lay in a border hotel without the baby, who had just turned seven months, I thought that I could see a way back to myself, that the place where I ended and the child began would somehow become obvious again, clearly defined. I was wrong about that: there was no going backward. There was no going forward either, at least not in the way I wanted or imagined. Since the birth of my daughter, I remain in limbo land, the borders of a self so carefully constructed over nearly four decades now shifting. She arrived and I disappeared, something like that anyway. The categories I had thought surrounded me have dissipated into confusion and nothingness, and that, if I think about it too much, can be terrifying. Did I turn into you, I used to ask her when she was a baby, or have you become me?

When Colm Tóibín walked the border between North and South in 1987, he bumped into questioning British soldiers; a blown-up bridge on a road that once led from Dublin to Enniskillen; and, in Derry, a march led by former DUP leader Peter Robinson, then in the ascendant. Tóibín feared opening his mouth during the march, lest the crowd (young men in the main, some drunk) spot him as a Southerner. Despite the disappearance of the island’s physical frontier, the hangover from these tensions remains. My friend, a middle-class Northerner from a Catholic background who has lived in Dublin for nearly twenty years, at times employs turns of phrase that leave me reaching for a Cockney rhyming slang dictionary. Yet she and I both use colloquialisms a person born in England will never have heard. Nonetheless, for a long time my friend was lost and lonely in Dublin, reluctant to move back to a society still undercut by a deep lack of trust but without solid ground in the cultural space of the South; she felt different. “I was different,” she told me, as I tried to grasp her feelings of statelessness. It’s not as if we are from different countries, I told myself, not really anyway. But the thing is, we are, both literally and metaphorically. The border has dissolved, but trauma, so deep, so wounding, cuts us off from one another, makes strangers of us in the same land, pulls me one way, pushes her another; trauma turns a society inward, and it has turned Northern Ireland, in the words of retired Oxford professor of Irish history Roy Foster, into more of its own little place than ever. What we have in common is this (and this is easy to write and hard to live): we are more the same than we are different.

The artist Rita Duffy grew up Catholic in a largely Protestant area of Belfast; she is the progeny of a Southern mother and a father whose own father, a Catholic from the Falls Road, died at the Somme. Her two great-uncles on her mother’s side supported and may have been actively involved in the 1916 rebellion, which ultimately led to Irish independence, a civil war, and the fracturing of the island of Ireland. “I was continually fluctuating between nationalities, between identities, curious to know could I somehow land up in the middle somewhere that satisfied me today,” she told a symposium I attended in 2016. In recent years, Duffy has established herself within the space of the liminal: “I crept out to the edges of Ulster and we bought a little piece of the border. We built a house and I now have a studio just a mile and a half on the Southern side, and I live a mile and a half on the Northern side, so I kind of live in neither-here-nor-there land, which is a really interesting place to be as an artist. It’s very confusing and out of that springs the best imaginative possibilities for me.” Out of those imaginative possibilities have arrived big, bold ideas. The Titanic passenger liner was built in Belfast; the tragedy of its unfulfilled promise can be viewed as a metaphor for the long years the North lost to violence. In 2005, Duffy founded Thaw, a company set up to fund the towing of an iceberg from the Arctic to Belfast, where it would be moored outside the city and allowed to melt, in the process encouraging the shrinking of the deep, frozen divisions that still exist within Northern Irish society. Duffy has not yet found the means to drag her iceberg to Belfast, but since 2003 her paintings have been replete with the mythology of those hulking, frozen structures. 
She has created figurative images that appear trapped, encased in ice: Father Edward Daly, crouching, waving his handkerchief; a close-up of an arm, gesturing, holding a white handkerchief that may itself be an iceberg in miniature; in another painting, there is a Pieta, a mother holding a dying son, both emerging out of the bulk of an ice structure.

Duffy paints these images in greys and yellows, sometimes browns or greens, always muted. But in the middle, or in the distance, there is something, a speck of brightness, a blob, the white-grey of Father Daly’s handkerchief-iceberg, the light that draws your eye and that looks and feels like a breath of gulping air. If you thaw the frozenness, if you let it melt into the Irish Sea, then a space can open up, the iceberg no longer blocks your view and holds you in its frozen time. Behind you lies the city, with its plurality of people, before you the sky and the vastness of the ocean, deep and bold and cerulean blue. Duffy’s iceberg queen, a mammoth, back-turned Victoria, ascends into that blue, the blue of space, the blue of a possibility that allowed for an impossible friendship during the short time that former IRA member Martin McGuinness and the once-trenchant defender of Unionism, Reverend Ian Paisley, worked together in government in Northern Ireland. If you thaw the frozenness, a space opens up, and into that space walked Ian Paisley Jr., son of the good preacher, on various radio stations in January 2017, offering “humble and honest thanks” to Martin McGuinness on the occasion of the latter’s retirement.

In the North Atlantic, the largest iceberg on record was measured at 550 feet above sea level, the same height as a 55-story building; less tremendous ice structures can still reach more than 200 feet high. The Titanic, travelling at top speed on a calm night, crashed into an iceberg that was more than a mile long and 100 feet high and had been growing into its dense mass of packed ice since the time of Tutankhamun, although once such an iceberg drifts from the Arctic to the warmer waters of the North Atlantic, which this one had, it will normally melt in two or three years. To an impatient human eye, this melting will be imperceptible until it is close to completion. My daughter likes to play with ice cubes; she takes them from her glass of water and lays them on the table, where she can contemplate their light, their translucence. When she first started this game, I watched benignly; these days I place a tissue or napkin on the table to soak up the water that spreads out so suddenly as the cube, whole only a moment before, turns liquid before our eyes.

In Bosnia, where I’ve been doing research, the iceberg is still solid, a mountainous whole that blocks ethnicities from seeing across to each other. The Bosnian peace deal of 1995 somehow managed to avoid the formation of literal borders; instead, the populace has retreated into different enclaves across the country, Muslims stick with Muslims, Serbs with Serbs, and so forth. The saddest example of this is Sarajevo, which now sits within the Federation of Bosnia and Herzegovina, one of two political entities that compose Bosnia. Sarajevo’s population is now almost 90 percent Muslim, many of them newcomers since the ending of the war; former residents, most of whom will never go back, mourn the city that was once multi-ethnic and cosmopolitan. Everything has changed, they will tell you, shaking their heads; the city is totally different. There is still separatist agitation, particularly in the Republika Srpska, the political entity that sits closest to Serbia proper, whose nationalist leaders threaten to form their own tiny state, but the frozen iceberg contains more than that: it holds the pain of deceit, of mistrust, of horrors, of loss, of history and geography, of denial and defense. Most of my time in Bosnia was spent in the Republika Srpska, in the east of the country, where I found myself crossing and recrossing the border with Serbia. At each crossing, I encountered the checkpoints: the wait, the documents handed over, the computer clearance, the questions on occasion, the stamps. My husband, a photographer, made the crossing alone once and was held for two hours while guards went through his equipment, his backpack, his wallet. 
The determined absorption of different religions and cultures into the shape of a single Yugoslavia after the First and Second World Wars had its problems too, but those Bosnians old enough to remember the time when the many amalgamated into the one speak of it wistfully, softly, as if it were a fairy tale they used to tell themselves as children. Their iceberg was waiting, biding its time out at sea, before it floated inland to lodge itself forcibly among them. The disappearance of the border between North and South Ireland has not sunk our icebergs. But over the past sixteen years, until the schism that was June 23, 2016, we had found ways to float one with the other, moored and not, comfortable and not, settled and not.

Is it possible to hold two contrary ideas at the same time: that sense that merging is both terrifying and monumental, the knowledge that we are all different, but that we live within a common world, that we can choose to be something and not? Although Alice Notley wrote that the birth of her child had left her “undone” — “feel as if I will / never be, was never born” — she could still see the other side: “Of two poems one sentimental and one not / I choose both / of his birth and my painful unbirth I choose both.” She hung in the balance, remained midway, gave herself over to not settling in. The child that was a baby when I began my Maze project recently turned ten and is in process, in transition. She is a self that I am not, although that self, according to Deleuze and Guattari, is only “a threshold, a door, a becoming between two multiplicities.” Her identity is no more fixed than mine is, than mine ever was, for all that I have scrambled to chase it. “What is real,” write the philosophers, “is the becoming itself. The only way to get outside the dualisms is to be-between, to pass between, the intermezzo.” There is confusion, and much relief, in such malleable thinking.

I was wrong, of course, in the assumptions I made about the Maze story. After the initial openness that followed the prison’s closure in 2000, when the paramilitary prisoners were let out and the public allowed in, political wrangling slowly strangled the goodwill until the great gates swung shut again; they have stayed shut, more or less, ever since. When I talked to people in and around the prison about politics and the peace, they felt bitter and hurt and sad, and that was not easy to hear. But any hard edges of fear and certainty seemed also to have blurred into a resignation that meant we could at least stand outside of compartmentalizations and inside the fuzzy space that doubt tends to uncover.

It was almost always cold at the Maze; even during summer, the fog hung heavy over its vast flatness. When I was in need of warming, I would retreat to the small security hut at the entrance to the site, where a handful of guards took phone calls and processed visitors. What I recollect about these visits are the moments of recognition. One of the men, a gentle soul, English-accented, who had lost his wife too early and now lived a simple life of work and extended family, was the chatty type. I still remember how he once articulated my fear. “You dip your finger in a pool of water, swirl it about for a while, and when you take it out, the water will return to the way it was. Then it will be as if you never were.”

***

This essay first appeared in Brick, the biannual print journal of nonfiction based in Canada and read throughout the world. Our thanks to Rachel Andrews and the staff at Brick for allowing us to reprint this essay at Longreads.

 

What Ever Happened To the Truth?

Corbis Historical / Getty

Bridey Heing | Longreads | July 2018 | 7 minutes (1,841 words)

It isn’t often that a book review makes headlines, but legendary New York Times critic Michiko Kakutani did just that in 2016. Published about six weeks before the presidential election — one day after the first debate between Donald Trump and Hillary Clinton, when it seemed Clinton’s win was inevitable — Kakutani’s review of Hitler: Ascent, 1889-1939 by Volker Ullrich went viral when it was perceived as an attack on then-candidate Trump. The review itself was dominated by bullet points drawing out ways in which Adolf Hitler went from a “‘Munich rabble-rouser’ — regarded by many as a self-obsessed ‘clown’ with a strangely ‘scattershot, impulsive style’” to Führer in a country regarded as one of the poles of civilization. Trump’s name was nowhere in the review, but publications jumped on the apparent comparison. “Trump-Hitler comparison seen in New York Times book review,” said CNN; “This New York Times ‘Hitler’ book review sure reads like a thinly veiled Trump comparison,” from the Washington Post; “A review of a new Hitler biography is not so subtly all about Trump,” according to Vox. Even later reviews of the book itself were shaded by Kakutani’s seeming comparison.

Almost two years later, a subtle comparison between Hitler and now-President Trump feels incredibly tame and undeserving of such heavy scrutiny. But at the time, such comparisons weren’t altogether common in the mainstream; Trump seemed destined to lose and fade into whatever post-campaign activity he chose to channel his not-insignificant celebrity towards. Instead, of course, he won, and comparisons like Kakutani’s became far more common as it became clear that the presidency would not temper his stated goals and ambitions.

The review would prove to be one of Kakutani’s last in her position as the New York Times book critic, a role in which she proved a formidable force within the literary world. It was announced in July 2017 that she would be stepping down after three and a half decades. Kakutani has famously kept her distance from the public eye, and her seemingly abrupt departure so soon after causing a media firestorm left many questioning her next moves. Now, one year later, we have an answer: The Death of Truth: Notes on Falsehood in the Age of Trump. Read more…

The Far Right’s Fight Against Race-Conscious School Admissions

WASHINGTON, DC - OCTOBER 10: Attorney Bert Rein (L) speaks to the media while standing with plaintiff Abigail Noel Fisher (R) after the U.S. Supreme Court heard arguments in her case on October 10, 2012 in Washington, DC. The high court heard oral arguments on Fisher v. University of Texas at Austin and was tasked with ruling on whether the university's consideration of race in admissions is constitutional. (Photo by Mark Wilson/Getty Images)

Late in the afternoon on July 3, the Department of Justice announced it was rescinding 24 documents issued by the Obama administration between 2011 and 2016. The documents offered guidance to a range of constituencies, including homeowners, law enforcement, and employers. Some detailed employment protections for refugees and asylees; seven of the 24 discussed policies and Supreme Court rulings on race-conscious admissions practices in elementary, secondary, and post-secondary schools. In its statement, the DOJ called the guides “unnecessary, outdated, inconsistent with existing law, or otherwise improper.”

No immediate policy change will come from the documents’ removal. It’s more of a signal, a gesture in a direction, a statement about ideology. The Trump administration has already enacted several hard-line positions on immigration. And the Sessions-backed Justice Department has made a habit of signaling, by way of gesture, its opposition to affirmative action, and its belief that race-conscious policies, specifically, often amount to acts of discrimination.

***

The term “affirmative action” is ambiguous and has never been strictly defined. It’s a collection of notions, gestures, and ideas that existed before its present-day association with race. According to Smithsonian, the term was likely first used in the Depression-era Wagner Act. This legislation aimed to end harmful labor practices and encourage collective bargaining. It also mandated that employers found in violation “take affirmative action including reinstatement of employees with or without backpay” to prevent the continuation of harmful practices. The reinstatement and payment of dismissed employees were affirmative gestures that could be taken to right a wrong.

Six years later, in 1941, under pressure from organizer A. Philip Randolph, President Franklin D. Roosevelt issued Executive Order 8802 to prohibit race-based discrimination in the defense industries during the buildup to WWII. It is considered the first federal action to oppose racial discrimination since Reconstruction, and paved the way for President John F. Kennedy, who was the first to use “affirmative action” in association with race in Executive Order 10925. Kennedy’s order instructed government contractors to take “affirmative action to ensure that applicants are employed,” regardless of “race, creed, color, or national origin.” President Lyndon B. Johnson expanded the scope of Kennedy’s order to add religion when he issued Executive Order 11246 in 1965. Two years later, Johnson amended his own document to include sex on the list of protected attributes.

It was Republican president Richard Nixon who expanded the use of affirmative action to ensure equal employment in all facets of government in 1969, when he issued Executive Order 11478. Nixon ran for office in 1968 on “law and order” and “tough on crime” messaging. He believed what he called “black capitalism” — the idea of thriving black communities with high rates of employment and entrepreneurship — would ease the agitations of civil rights groups and end urban unrest. At the time, Nixon’s rhetoric won the support of a smattering of black cultural figures such as James Brown. “Black capitalism” was little more than a co-optation of some of the tenets of Black Power, which itself had come from a long-established line of conservative black political thought that emphasized economic empowerment and independence, self-determination and personal responsibility. In his version, Nixon envisioned only a slight role for the federal government; without the push of significant government investment, the policies and programs he created didn’t result in sweeping change. Still, shadows of Nixon’s thinking on black economics endured: They’re present in multiple speeches Obama made to black audiences during his presidency; Jay Z’s raps about the transformative, generational effects of his wealth; Kanye West’s TMZ and Twitter rants. Also, the backlash Nixon faced is remarkably similar in tone and content to today’s challenges to affirmative action, which typically involve a white person’s complaints about the incremental gains made by members of a previously disadvantaged group:

In 1969 Section 8(a) of the Small Business Act authorized the SBA to manage a program to coordinate government agencies in allocating a certain number of contracts to minority small businesses — referred to as procurements or contract “set-asides.” Daniel Moynihan, author of the controversial Moynihan Report, helped shape the program. By 1971 the SBA had allocated $66 million in federal contracts to minority firms, making it the most robust federal aid to minority businesses. Still, the total contracts given to minority firms amounted to only 0.1 percent of the $76 billion in total federal government contracts that year.

Yet even these minuscule minority set-asides immediately faced backlash from blue-collar workers, white construction firms, and conservatives, who called them “preferential treatment” for minorities. Ironically, multiple studies revealed that 20 percent of these already meager set-asides ended up going to white-owned firms.

***

A sense of lost advantage and power seems to animate both historical and recent challenges to race-based policies and practices. In Regents of the University of California v. Bakke (1978), the first affirmative action case the Supreme Court ruled on, Allan Bakke, a white University of California at Davis medical school applicant, sued the school after being twice denied admission. The school had created a system to set aside a certain number of spaces for students from marginalized groups. The Court decided practices that relied on quota systems were unconstitutional, but it upheld the use of race in admissions decisions as long as it was among a host of other factors. Rulings in subsequent cases, such as Grutter v. Bollinger (2003) and, most recently, Fisher v. University of Texas (2016), supported the use of race in admissions and reiterated the federal government’s interest in the diversity of the nation’s institutions. In the most recent case, now-retired Justice Anthony Kennedy provided the Court’s swing vote.

Plaintiffs in affirmative action challenges tend to argue race-conscious admissions policies violate rights granted by the Fourteenth Amendment, especially its clause guaranteeing “equal protection of the laws.” Ratified 150 years ago last week, the Fourteenth Amendment established birthright citizenship and defined citizenship’s parameters. Its ideas originated in the years leading up to Reconstruction, during “colored conventions” held among African American leaders and activists, and form the underpinnings of Brown v. Board of Education (1954) and some provisions of the Civil Rights Act of 1964.

One of the most prominent opponents of affirmative action, Edward Blum, a fellow at the American Enterprise Institute, actively seeks and recruits aggrieved plaintiffs and attorneys to challenge race-based policies in school admissions and voting practices. Blum was the force behind the complaint of Abigail Fisher, the white student at the center of Fisher v. University of Texas. According to the New York Times:

In the Texas affirmative action case, he told a friend that he was looking for a white applicant to the University of Texas at Austin, his own alma mater, to challenge its admissions criteria. The friend passed the word to his daughter, Abigail Fisher. About six months later, the university rejected Ms. Fisher’s application.

“I immediately said, ‘Hey, can we call Edward?’” she recalled in an interview.

The case went to the Supreme Court twice, and though Ms. Fisher was portrayed as a less than stellar student, vilified as supporting a racist agenda, and ultimately lost, she said she still believed in Mr. Blum. “I think we started a conversation,” she said. “Edward obviously is not going to just lie down and play dead.”

Blum’s first lawsuit came about after he lost a Congressional election in Houston because, he felt, the boundaries of his district were drawn solely along racial lines. He is now behind lawsuits against Harvard University and the University of North Carolina at Chapel Hill, which allege the schools’ admissions policies discriminate against Asian American applicants. It is interesting and bold to use white women and Asian American students to dismantle programs meant to address America’s legacy of discrimination. Both groups have benefited significantly from Reconstruction and Civil Rights-era policies and legislation. Do Blum, Sessions, and their supporters believe race-based policies are irrelevant, illegal, or improper because for many, they’ve worked? I sense something more nefarious at play, such as a mounting sense of loss and growing resentment that the demographic shifts in our country also mean inevitable shifts in who holds power.

The Sessions-helmed Justice Department’s signals and the nomination of Judge Brett Kavanaugh to the high court have, I’m sure, heartened activists like Blum. For The Nation, Eric Foner wrote about how the Fourteenth Amendment’s ambiguity is what allows it to be used in a way that is so at odds with the spirit of its origins. It is that ambiguity, he says, that will allow, someday, in a different political climate, for another era of correction.

Sources and further reading: