
The Cowboy Image and the Growth of Western Music


Bill C. Malone and Tracey Laird | Country Music USA | University of Texas Press | June 2018 | 25 minutes (6,531 words)

The emergence of the western image in country music was probably inevitable. Long before the process of commercialization began, the cowboy had been the object of unparalleled romantic adulation and interest. Given the pejorative connotations that clung to farming and rural life, the adoption of cowboy clothing and western themes was a logical step for the country singer.

The increased emphasis on western themes and attitudes appeared unsurprisingly in the westernmost southern states ─ Louisiana, Oklahoma, Texas ─ and in California. In these areas, country music assumed forms differing from those in the more easterly southern states. Oklahoma, Louisiana, and Texas, although southern in traditional orientation, embodied significantly different elements. All three were touched by the oil boom of the early twentieth century, and each possessed population groups that stood apart culturally while simultaneously influencing the dominant “Anglo” element of the state. Oklahoma and Texas were settled, for the most part, by former residents of the older southern states, who had brought with them their values, traditions, and institutions. Louisiana, on the other hand, can be perceived as a land of at least three great cultures: a Roman Catholic, “Latin” culture in the South; an “Anglo,” Protestant culture in the north; and an African American culture whose influence could be felt throughout the state. Immigrants brought slaves and the cotton culture to all parts of the Southwest, making Texas and Louisiana parts of the southern economic and political orbit. They also transported their evangelical Protestantism to southwestern soil and brought with them many features of their folk heritage. Some of the old British ballads survived the westward migration, although they had lost many of their former characteristics. In some Texas communities, such as those found in the Big Thicket, a heavily forested area in the eastern part of the state, old ballads and old styles of singing endured well into the twentieth century. Many of the East Texas communities were, and remain, replicas of the older southern environment. And, in many of them, folk traditions died slowly.


Texas folk music, then, was basically southern derived. Texas rural musicians used instruments common to the rest of the South, sang in styles similar to those of other rural southerners, frequently attended house parties where old-time fiddlers held sway, and learned to read music at the shape-note singing schools. But despite its close cultural affiliation with the South, Texas had a culture all its own ─ a culture produced by the mingling of diverse ethnic strains: southern “Anglo,” black, German and Central European (especially prevalent throughout the southern part of the state), Mexican, and Louisiana Cajun (in the area extending from Beaumont to Houston). A passion for dancing was common among all these groups, and in this heterogeneous society, musical styles and songs flowed freely from one group to another, modifying the old southern rural styles. While rural music was prevalent and pervasive, it differed substantially from that produced in the Southeast or in the Deep South.

The discovery of oil at Spindletop, near Beaumont, in 1901 was the first of a series of finds in southeastern Texas, southwestern Louisiana, Oklahoma, and Arkansas in the years extending up through World War I. The discovery of the great East Texas oil field in the early 1930s, along with the rapid industrialization that began during World War II, further set Texas apart from the other southern states. While these factors contributed to Texas’s uniqueness, they are probably less important than the fact that it was also part of the West. In fact, to most Americans, Texas was and is the West. And this West was a glorious land peopled by cowboys.

The romantic concept of the West, shared by most Americans, has a history virtually as old as the nation itself. James Fenimore Cooper’s early novels describing the restorative qualities of the frontier were not substantially different, nor less romantic, than the themes emphasized later in Bret Harte’s stories, in the western “dime novels,” or in such books as Owen Wister’s The Virginian. Thus, the cowboy and the West had been bathed in romance long before Hollywood and the television industry began their exploitations of the theme. The American people also had long demonstrated a general interest in the songs of the cowboy ─ beginning with Nathan Howard Thorp’s Songs of the Cowboys, 1908, and John A. Lomax’s Cowboy Songs and Other Frontier Ballads, 1910 (as a matter of fact, as early as 1907, when “San Antonio” appeared, Tin Pan Alley tunesmiths had experimented with “cowboy” themes). Although a few concert-musicians such as Oscar Fox (from Burnet, Texas) and David Guion (from Ballinger, Texas) made classical arrangements of a few cowboy songs, the western theme did not make any significant impact on American music until the 1930s. Guion’s version of “Home on the Range,” first performed in 1930 in a New York play called “Prairie Echoes,” became the most popular arrangement of the song and was said, perhaps apocryphally, to be President Franklin Roosevelt’s favorite song. Such songs became so widely circulated in the 1930s that even Tin Pan Alley reverberated with the melodies of the range. The farther Americans became removed from the cowboy past, the more intense became their interest in cowboy songs and lore. Hillbilly singers and musicians did much to implant the romantic cowboy image in the minds of their American audiences.

Before the 1930s, a few musicians recorded songs that genuinely reflected the cowboy heritage. The concert singer Bentley Ball ─ who did many programs of patriotic and traditional songs, many of them in colleges ─ recorded “The Dying Cowboy” and “Jesse James” for Columbia in 1919. Charles Nabell, in November 1924, recorded some cowboy songs for Okeh, along with other types of traditional material. Several of the early cowboy singers came from Texas, and their songs, for the most part, reflected genuine cowboy experience. Carl Sprague, for example, may have done most to generate an immediate interest in the recorded songs of the cowboy. He grew up on a South Texas ranch near Alvin where he learned many of the songs (most of them from his cowboy uncle) that he later recorded for Victor. His 1925 recordings of cowboy songs — topped off by the immensely popular “When the Work’s All Done This Fall” — mark him as one of America’s first singing cowboys. While Sprague was attending Texas A&M, Vernon Dalhart’s success as a singer of traditional songs convinced him that a similar market for cowboy singers might exist. He traveled to New York and had a successful audition with Victor Records; his earliest recordings had a sound very similar to that of Dalhart, including guitar and studio violin. Singing, however, was never more than a hobby with Sprague, and aside from his recordings, he made few commercial appearances. For many years he was on the coaching staff at Texas A&M, and, in addition, he attained the rank of major in the United States Army.


Jules Verne Allen, on the other hand, had actually experienced the rugged life of a working cowboy before he embarked on his career as a radio singer. Born in Waxahachie, Texas, Allen began working cattle in Jack County, west of Fort Worth, at the age of ten. From 1893 to 1907 he worked as a rough string rider and bronco buster from the Rio Grande to the Montana line. Unlike Sprague, he used cowboy music as the basis for a professional career. During the 1920s and 1930s, Allen sang over numerous radio stations, including WOAI in San Antonio, where he performed as “Longhorn Luke.” Like most of the pioneer recording performers of the 1920s, Allen and Sprague drew most of their material from turn-of-the-century cowboy life, although some of their songs were learned directly from the Lomax collection.

Other cowboy singers of the early commercial period varied widely in the amount of actual range experience they possessed. The Cartwright Brothers (Bernard and Jack) grew up in Boerne, Texas, directly on the route of “the long drive” that proceeded on to Kansas. Essentially a fiddle band, the Cartwrights performed a variety of songs. Their version of “Texas Rangers,” however ─ marked by Bernard’s haunting fiddle ─ is one of the greatest performances of a cowboy song heard on early commercial records. Carmen William “Curley” Fletcher, from California, was a rodeo performer and itinerant hawker of songs long before he made any commercial recordings. His greatest claim to fame came through his writing in 1915 of the poem that became the basis for “The Strawberry Roan,” which he sold on broadside sheets. The song became one of the most popular western numbers, performed usually with a chorus added by the California radio singers Fred Howard and Nat Vincent. At least a couple of the pioneer cowboy singers, Goebel Reeves and Harry McClintock, were southerners whose wanderlust drew them west, where they worked at a wide variety of occupations. Both men, for example, spent some time in the famous radical labor union the Industrial Workers of the World (IWW, or Wobblies).

Our knowledge of the otherwise shadowy figure of Goebel Reeves comes from the pioneering research done by Fred Hoeptner. Known as “the Texas Drifter,” Reeves was born in Sherman, Texas, in 1899. Before his death in California in 1959, he had enjoyed a varied career that led him across the United States and around the world. Although he came from a respectable middle-class family (his father served in the Texas legislature), Reeves deliberately chose the life of a hobo. During the course of his wanderings, he enlisted in the army, saw front-line service in World War I, worked as a merchant seaman, became active in the IWW, toured the vaudeville circuit, performed on radio, and recorded under several names for such companies as Okeh and Brunswick. In his recording career as a singer and yodeler ─ he claimed to have taught Jimmie Rodgers the yodeling style in the early 1920s while living in New Orleans ─ Reeves introduced some of the most interesting examples of both cowboy and hobo songs found in American music. These included the well-known “Hobo’s Lullaby” (which he claimed to have written), “The Hobo and the Cop,” “Railroad Boomer,” and the cowboy songs “Bright Sherman Valley” and “The Cowboy’s Prayer.”

Harry McClintock was as well traveled as Reeves, having also been a merchant seaman, a soldier, and a hobo. Born in Knoxville, Tennessee, he roamed widely throughout the United States and became a member of the IWW in the early twentieth century. Because of his musical talents, McClintock was a welcome addition to the Wobblies, who had a well-known fondness for singing and whose Little Red Songbook became virtually the bible for labor/protest singers in America. McClintock’s claim that he wrote “Hallelujah, I’m a Bum” and “Big Rock Candy Mountain,” two of the world’s most famous hobo songs, has never been seriously challenged. Once he settled down from his wanderings, McClintock began a career as a radio cowboy singer as early as 1925 on KFRC in San Francisco. “Haywire Mac,” as he was often called, also recorded for Victor from 1927 to 1931. Along with superbly performed cowboy songs such as “Sam Bass,” “Jesse James,” and “Texas Rangers,” McClintock’s labor songs make him one of the important progenitors of western music.

John White and Otto Gray contributed to the shaping of western music by presenting it widely to a national audience. White was an unlikely “westerner,” hailing from Washington, DC. However, he was the first person to introduce cowboy songs on radio to a New York audience (on NBC from 1927 to 1936). He also recorded cowboy songs, as well as hillbilly material, from 1929 to 1931, under several pseudonyms including “the Lonesome Cowboy.” White specialized in the history of cowboy songs, and over the years he did more than any other person to describe the origins of the ballads and to dispel much of the romantic claptrap that had gathered around them.

Otto Gray, a prosperous rancher from near Stillwater, Oklahoma, pioneered in the commercialization of cowboy music. In about 1923, he assumed the leadership of a string band that earlier had been composed of real cowboys ─ the McGinty Cowboys (named for Billy McGinty, an Oklahoma rodeo performer). Gray’s group had the distinction of being one of the few country groups publicized in Billboard, although Gray paid for most of the advertising. From 1928 to 1932, Gray and his Oklahoma Cowboys made a tour of radio stations throughout the country and performed on the northeastern RKO vaudeville circuit. Momie Gray (Otto’s wife) was the featured singer of the organization, specializing in sentimental songs. The Oklahoma Cowboys were a highly professional group that possessed most of the characteristics of slick show-business organizations. A special publicity man traveled in advance of the group, and appearances on radio stations provided further exposure. Two agencies, the Weber-Simon Agency in New York and the William Jacobs Agency in Chicago, handled the group’s RKO bookings. The Gray performers, dressed in plain, western-style clothing, traveled in Gray’s $20,000 custom-built automobile, which was wired for sound reproduction and had a radio receiver and transmitter.

If Otto Gray contributed significantly to the commercialization of “western” music, Jimmie Rodgers played an equally important role in fusing it with country music. As discussed earlier, Rodgers spent the last few years of his life in Texas and conducted many of his most successful tours there. He took great pride in the Texas heritage and the romantic cowboy past. The modern concepts of the “singing cowboy” and of “western” music may very well date back directly to Rodgers.

Scores of singers who modeled themselves after Jimmie Rodgers emerged in the 1930s, and most of them gave themselves “cowboy” titles and dressed in western attire. Young Hank Snow, for example, in far-off Nova Scotia, dressed in cowboy regalia and called himself “the Yodeling Ranger.” In even more remote Australia, Robert William Lane performed under the name of Tex Morton, described himself as “the Boundary Rider,” and sang cowboy songs with a bizarre, trilling yodel about both the Australian bush and the Texas Plains. Others, like Ernest Tubb, included few cowboy songs in their repertories but wore cowboy boots and ten-gallon hats. Since the western attraction was irresistible, even young hillbilly singers from the Deep South or from the southeastern mountains, whose associations with cowboys came only through story and song, embraced the western image and imagined themselves “way out west in Texas for the roundup in the spring.”

Perhaps because of Rodgers’s close association with Texas, many of the successful Texas hillbilly performers ─ Ernest Tubb, Lefty Frizzell, Floyd Tillman, Bob Wills, Tommy Duncan ─ credited Jimmie Rodgers as their inspiration. One of the most important of these individuals, and the one who completed the “romantic westernizing” process begun by Rodgers, was Orvon Gene Autry. Autry owed most of his initial success to the fact that he could perform Rodgers’s repertory in Rodgers’s yodeling style. Autry was born on a horse farm near Tioga, Texas, on September 29, 1907, but moved to Oklahoma with his parents while in his teens. Although his father was a horse trader, Gene experienced little of the cattle ranch life that his promotional material later stressed. At any rate, he left the “ranching” life as quickly as he could, working as a railroad telegrapher and singing at every opportunity.

According to a much-repeated story, confirmed by Autry himself, Will Rogers inspired his decision to become a professional musician. One day in 1927 the great humorist came to Chelsea, Oklahoma, where Autry was working as a telegrapher for the St. Louis and Frisco Railroad, heard the young man singing and strumming his guitar, and strongly encouraged him to go to New York and become a professional. Autry’s first trip to the big city in 1927 was unsuccessful, but he returned to Tulsa and got a job on KVOO as “the Oklahoma Yodeling Cowboy.” Returning to New York in 1929, he made his first records for Victor, accompanied by the Marvin Brothers, Johnny and Frankie. In December of the same year, Autry began a crucial association with Arthur Satherley, who recorded him for the American Record Company (ARC), producer of records for chain stores and for Sears. It was through the association with the Sears Conqueror label that Autry made it to WLS and the National Barn Dance.

In Chicago after 1931, Autry was an immediate success. His appearances on the Barn Dance and on his own radio program, Conqueror Record Time, made him one of the most popular performers in WLS history. His records, released on Sears labels, were those most prominently displayed in the Sears-Roebuck catalogue. As a result of his growing popularity, a number of Gene Autry songbooks and guitar instruction books began to appear in the early 1930s. An ad for a Gene Autry “Roundup” Guitar, priced at $9.95, reminded the reader that Autry had become a famous performer “simply because he learned to play a guitar while on the ranch.” Autry’s promotional mentors, Art Satherley and Ann Williams of the WLS production staff, capitalized on the “western” motif and advertised him as a singing cowboy long before the bulk of his recorded repertory came to include western numbers.


In his early years as a professional singer, and on through the WLS period from 1931 to 1934, Autry remained a hillbilly singer, only rarely singing anything of a western variety. In both song selection and in style of performance, he revealed his indebtedness to the southern rural tradition. His Jimmie Rodgers imitations were among the best in country music, and his own “compositions” (written or cowritten with people like Jimmie Long) included such songs as “A Gangster’s Warning,” “A Hillbilly Wedding in June,” “Gosh, I Miss You All the Time,” and “My Old Pal of Yesterday.” In 1931, he recorded one of the biggest-selling hits in hillbilly music’s then-short history, “That Silver Haired Daddy of Mine,” recorded as a duet with the song’s co-composer, Jimmie Long. Autry’s many and varied recorded selections even included at least one labor song: “The Death of Mother Jones,” recorded on at least seven labels, which applauded the life of the famous and radical labor leader. While the song seemed rather remote from the type one would expect from a cowboy singer, it nevertheless reflected the passion for social and economic justice that many people felt during these Depression years.

Autry’s success on the Chicago radio stations and on record labels gained him in 1934 the position that made him the best-known cowboy in the United States and one of the most famous hillbilly singers. In that year, he arrived in Hollywood and began his career as the “Nation’s Number One Singing Cowboy.” Beginning with a small part in Ken Maynard’s In Old Santa Fe, he then starred for thirteen episodes in a strange cowboy/science-fiction serial called The Phantom Empire. Autry went on to a featured role in 1935 in Tumbling Tumbleweeds, a film that also included his old sidekick from Chicago days, Lester Alvin “Smiley” Burnette. In the following decades, he made more than ninety movies for Republic, Columbia, and Mascot, eighty-one of which included the multitalented Burnette, who usually played a bumbling character, Frog Millhouse. While becoming one of the most popular and wealthy actors in Hollywood, Autry also created the stereotype of the heroic cowboy who was equally adept with gun and guitar. Autry was not the first individual to sing in a western movie ─ Ken Maynard had done so as early as 1930 ─ but he was the first to institutionalize the phenomenon. With Autry ensconced as a singing movie cowboy, hillbilly music now had a new medium through which to popularize itself. The silver screen further romanticized the cowboy and helped shape the public idea of western music.

After signing his Hollywood contract, Autry made a radical shift in his repertory from “country” themes to “western” motifs. Instead of singing songs about the mountains, he came increasingly to perform songs with such titles as “Ridin’ Down the Canyon,” “The Round-up in Cheyenne,” and “Empty Cot in the Bunkhouse.” Both in Autry’s singing and in the instrumentation that accompanied him, one hears a distinctly measurable change in the records he made from 1929 to 1939. As the one-time hillbilly singer reached out to a larger audience, he smoothed out his presentation of material with a lower vocal pitch, well-rounded tones, and honey-coated articulation. Instrumentally, Autry’s sound exhibited a similar evolution, particularly after the violinist Carl Cotner became his musical director. Soft guitars, muted violins, a melodious but unobtrusive steel guitar, an accordion, and occasionally even horns could be heard as background instrumentation, as he and his directors sought a sound that would give no offense to America’s broad urban middle class. Whatever vocal sound was featured, however, Autry demonstrated a mastery of it. No country singer has ever shown more versatility.

Autry’s popularity inspired other movie companies to present their own versions of the singing cowboy. In searching for likely candidates, the companies usually delved into the ranks of country music, acquiring acts that had already established themselves on hillbilly radio shows or on record labels. Following Smiley Burnette, the Light Crust Doughboys became the first country group to join Autry in a movie (Oh, Susanna!). Some Autry sidemen went on to become important entertainment personalities in their own right. Johnny Bond, Jimmy Wakely, and Dick Reinhart, for example, came to Hollywood in 1940 (as the Jimmy Wakely Trio) and joined Autry’s Melody Ranch radio show in September of that year. Reinhart became one of the early exponents of the honky-tonk style, with songs like “Fort Worth Jail” and “Truck Driver’s Coffee Stop.” Wakely eventually starred in many movies of his own, became one of country music’s smoothest singers, and made several seminal recordings, such as “One Has My Name (The Other Has My Heart)” (one of the first successful “cheating” songs in country music). Bond remained on the Melody Ranch program until it ended in 1956, playing the role of a comic sidekick and opening the show each Sunday with the bass guitar run introduction to “Back in the Saddle Again.” Bond also became one of country music’s greatest songwriters, creating such songs as “Cimarron” (a song about a small river in Oklahoma, and performed by all western groups), “I’ll Step Aside,” “Old Love Letters,” and “I Wonder Where You Are Tonight” (now a standard in both bluegrass and mainstream country music).

A long line of hillbilly singers made only occasional appearances in western movies, usually as supporting actors for such leading cowboy stars as Charles Starrett and Johnny Mack Brown. The Sons of the Pioneers appeared in numerous movies, while Bob Wills and his Texas Playboys were in about eight. A few singers, such as Ernest Tubb, Jimmie Davis, and Bill Callahan, made only rare appearances.

Other singers, however, became leading men and posed at least modest challenges to Autry’s dominance. Atlanta-born Ray Whitley, the writer of “Back in the Saddle Again” and the designer of one of country music’s most popular guitars, the Gibson SJ-200, became a movie star in 1936 after an earlier successful career in New York as a cowboy singer. Tex Ritter also began his movie career in 1936, and, in the fifty-six movies that he eventually made, he became the most believable of all the singing cowboys. The most successful challenge to Autry, though, came from Roy Rogers, who signed with Republic in 1937. His visibility in American public life would last, because of television, well into the 1960s. The singing cowboy genre also persisted in American movies on into the 1950s, with Arizona-born Rex Allen being its chief exponent after 1949. In many ways, this last singing cowboy was the best singer of them all. Allen’s rich voice ranged from a deep bass to a sweeping tenor ─ a sound that almost no other country singer could equal.

Largely as a result of Hollywood exploitation, the concept of “western music” became fixed in the public mind. After the heyday of Gene Autry, the term “western” came to be applied even to southern rural music by an increasing number of people, especially by those who were ashamed to use the pejorative term “hillbilly.” Not only did the public accept the projection, but even most hillbilly singers became fascinated with the western image and eventually came to believe their own symbols. Autry was the first of a long line of country singers who clothed themselves in tailored cowboy attire; in the following decades, the costuming became increasingly elaborate and gaudy, with the brightly colored, bespangled, and rhinestone-laden uniforms created by Nudie the Tailor (Nudie Cohn, born Nuta Kotlyarenko in the Ukraine in 1902) in Los Angeles being the most favored fare. Eventually, most country performers, whether they hailed from Virginia or Mississippi, adopted cowboy regalia ─ usually of the gaudy, dude cowboy variety.




Along with the clothing, country bands and singers ─ particularly in the Southwest and on the West Coast ─ adopted cowboy titles. Singers with names like Tex, Slim, Hank, Red River Dave, the Utah Cowboy, and Patsy Montana, and groups with such titles as the Cowboy Ramblers, Riders of the Purple Sage, Radio Cowboys, Swift Jewel Cowboys, Lone Star Cowboys, and Girls of the Golden West (Dolly and Millie Good) abounded on radio stations (and record labels) all over the nation. Radio and record promoters, of course, were very much alive to the appeal of the western myth, and they often encouraged musicians to adopt appropriate western monikers. Millie and Dolly Good, for example, were farm girls from Illinois who sang and yodeled in sweet, close harmony. Their agent advised them to dress like cowgirls, gave them the romantic title Girls of the Golden West, and then, after scanning the map of western Texas, attached to their promotional literature the statement that they were born in Muleshoe, Texas. The Girls very carefully preserved this fiction to the end of their performing career.

Patsy Montana’s career was similarly shaped by romantic conceptions of the West. She was a singer and a fiddler from Arkansas named Rubye Blevins, but on the West Coast in the early 1930s, Stuart Hamblen renamed her Patsy Montana, and she thereafter cultivated the performing image of the cowgirl. Although much of her career saw her appearing as a “girl singer” with such groups as the Prairie Ramblers, Patsy made dramatic history in 1935 when “I Want to Be a Cowboy’s Sweetheart” became the first huge hit by a woman country singer and a virtuoso yodeling piece that still influences the style of women singers (Austin country-rock singer Marcia Ball, for example, made the song and yodel standard parts of her repertory in the late 1970s).

Many of the “western” entertainers performed cowboy songs, usually highly romanticized, but more often their titles and attire were the only ties they had with the “West.” Several musicians, however, stayed rather close to the cowboy repertory. Some of them had been performing long before Gene Autry achieved Hollywood fame, and many of them, such as “Haywire Mac” McClintock and the Crockett Family (John H. “Dad” Crockett and his five sons, originally from West Virginia), had performed on California radio stations since at least 1925. Other early California groups included Len Nash and his Original Country Boys, broadcasting from KFWB, Hollywood, as early as March 1926; Sheriff Loyal Underwood’s Arizona Wranglers; Charlie Marshall and his Mavericks; and perhaps the most important (and certainly the most interesting), the Beverly Hillbillies.


The Beverly Hillbillies were the brainchild of Glen Rice, station manager at KMPC in Los Angeles. Reversing the trend toward adoption of western names during the 1930s, Rice used the eastern moniker Hillbillies for the group of western musicians that he assembled around the accordion player Leo Mannes (renamed Zeke Manners) and conducted a ballyhoo campaign alleging that a group of strange and primitive musicians had been unearthed in the hills of Beverly. The band made its debut on KMPC on April 6, 1930, and remained a popular feature throughout the decade. Over the years the Hillbillies included several fine musicians, such as Manners, who had no background in country music but had been attracted to California because of the lure of Hollywood. A few Hillbillies were genuine country boys, such as the sky-high yodeler Elton Britt (James Britt Baker), who came from Arkansas in 1930, and Stuart Hamblen, who came from Texas in the same year. Britt went on to become one of country music’s most gifted yodelers (virtually the last of that once-hardy breed) and a leading soloist during the 1940s. Hamblen, the son of a Methodist minister in Abilene, Texas, was a fixture on West Coast radio from 1930 to the 1950s. He hosted his own shows in Hollywood after 1931, boosted the careers of other performers, wrote many of the most successful songs of the decade (including “My Mary,” “Texas Plains,” “Golden River,” and “My Brown-Eyed Texas Rose”), was the first country performer signed by Decca in 1934, and became sufficiently known to become a candidate for Congress in 1938.

The western group that ultimately became the most famous, and the most frequently emulated, was the Sons of the Pioneers. They sang virtually every type of country song and even ventured into popular music, but the majority of their melodies dealt with western themes. Perhaps more than any other group, they preserved a western repertory and exploited the romantic cowboy image. More “western” stylistically than any other group, they were among the least western in terms of origin. Bob Nolan (Robert Clarence Nobles) was born in New Brunswick, Canada, but he moved with his parents to Tucson at the age of fourteen. In Tucson he found himself fascinated with the desert, a feeling that never left him and eventually inspired some of country music’s greatest songs, such as “Cool Water,” “Tumbling Tumbleweeds,” and “At the Rainbow’s End.” Tim Spencer, also an outstanding songwriter, was born in Missouri but grew up in Oklahoma, Texas, and New Mexico. Roy Rogers came from southern Ohio.

The three musicians came to California in the early 1930s and soon fell into a pattern common to most country singers during the decade, moving from group to group before they formed their own organization. Roy Rogers, the prime organizer of the trio, was born Leonard Slye in Cincinnati, on November 5, 1911, but grew up on a small farm near Portsmouth, in southern Ohio. Here he garnered his earliest musical training from his Kentucky-born mother and his mandolin-and-guitar-playing father. In 1931 he and his father moved to Tulare, California, and worked as migratory fruit pickers. In the following three years, beginning with a duo called the Slye Brothers (Leonard and a cousin), he worked with several western-style groups until the Pioneer Trio was formed in 1933. Renamed the Sons of the Pioneers the following year, the trio soon became noted for their smooth, inventive harmonies and yodeling, and for the finely crafted songs that Nolan and Spencer created. They became so famous for their harmony that their instrumental accompaniment is often forgotten. Two extraordinarily talented brothers from Llano, Texas, Hugh and Karl Farr, joined them in 1934 and 1935. The Farrs were jazz-influenced country musicians whose progressive styles were sometimes obscured by the vocal emphasis of the Pioneers. Hugh Farr, who also sang a low-down bass with the group, was one of the hottest fiddlers of the period, and his brother, Karl, was a master of both the rhythm and single-string styles of guitar.

The Pioneers won extensive popularity on the West Coast with an early-morning radio program on KFWB in Hollywood, but 1936 proved to be their banner year. By this time their radio transcriptions were being widely circulated, and the group became a featured act, along with Will Rogers, at the Texas Centennial in Dallas. Leonard Slye left the group in 1937 after signing a movie contract with Republic Studios. At this point he changed his name, first to Dick Weston, and later to Roy Rogers. His performances after this time were made on an individual basis, and he eventually rivaled Gene Autry as America’s most popular singing cowboy (Rogers was also one of country music’s finest yodelers). He was replaced in the Sons of the Pioneers by Lloyd Perryman from Ruth, Arkansas, whose natural tenor was the first the group had ever had, and who gave them an even closer harmony than they had earlier possessed. The Sons of the Pioneers underwent numerous personnel changes after 1937 but have never disbanded. Their songs moved into the repertories of country singers everywhere, and their style of harmony was widely copied, most effectively by Foy Willing (originally Willingham) and the Riders of the Purple Sage, who appeared with Monte Hale and Roy Rogers in Republic Pictures from 1942 to 1952.

The flourishing singing cowboy industry inspired the emergence of songwriters, including two of country music’s finest ─ Fred Rose and Cindy Walker ─ who made their debuts as country composers in the 1940s when they wrote songs for movies (Rose for Autry, Walker for Bob Wills). The interest in western music in the 1930s, however, was not confined to country performers and their supporters. Writers from Tin Pan Alley also reacted to the western craze, and the entire nation was soon humming western-style tunes such as “Gold Mine in the Sky,” “There’s a Home in Wyoming,” and “I’m an Old Cowhand.” Some of these tunes were written by easterners who had never been near a cow, but the Happy Chappies at least lived in California in the midst of the Hollywood industry. The Chappies were a pop-singing duo, Nat Vincent and Fred Howard, who wrote or arranged such songs as “When the Bloom Is on the Sage,” “Mellow Mountain Moon,” “My Pretty Quadroon,” and “Strawberry Roan” (the last a musical adaptation of Curley Fletcher’s earlier poem). The most successful of the western-oriented popular songwriters was a Bostonian, William J. (Billy) Hill. Hill’s birth and musical training gave no indication of his future success as a western songwriter. Born in Boston in 1899, he studied violin at the New England Conservatory of Music and performed for a short time with the Boston Symphony Orchestra. In 1916 he traveled west, riding the rails and working at odd jobs until he had seen most of the western states. He returned to New York in the late 1920s after becoming thoroughly acquainted with western life ─ including everything from camp cooking to cowpunching. In New York he worked as a doorman at a fashionable hotel and composed songs occasionally.
Over the years his compositions ranged from popular melodies like “The Glory of Love” to hillbilly songs like “They Cut Down the Old Pine Tree” and “The Old Spinning Wheel.” His chief success, however, came with western-style songs like “Call of the Canyon,” which were distinguished for their beautiful melodies and for rhythms that suggested the gait of a horse. He experienced his most spectacular success in 1933 with “The Last Roundup,” the song that really awakened the general public to the romantic West while becoming the most popular tune in the country. Performed by both hillbilly and popular groups, its appeal may have stimulated a greater interest in the more “authentic” country and western material and ensured a greater national following for country music.

Most of the western bands in California and the Southwest used Billy Hill’s material, but his New York songwriting ventures were directed primarily at big-city popular-music audiences. Although country music has always encountered its coolest reception in the Northeast, particularly in the city of New York, country-style entertainers have always achieved some prominence there on local radio stations. Ethel Park Richardson, for example, did much to educate New Yorkers about the beauties of folk culture between 1933 and 1935 with her weekly dramatizations on WOR and the NBC Network. Each week she was assisted by such singers as Frank Luther, Carson Robison, and Tex Ritter as she dramatized a famous folk song. Luther and Robison had been in New York since the 1920s, but Ritter was one of several cowboy singers who kept New Yorkers range conscious during the mid-1930s. Others included Texas Jim Robertson, a deep-bass singer from Batesville, Texas; Zeke Manners and Elton Britt, who had moved from California; Dwight Butcher, a Jimmie Rodgers disciple from Tennessee; Ray Whitley, who sang regularly at the Stork Club and on WMCA; and Wilf Carter, the Nova Scotia yodeler who performed over CBS as Montana Slim.

The most singular of all the cowboy singers in New York, however, was Woodward Maurice “Tex” Ritter. Born in Murvaul, in deep East Texas, on January 12, 1905, Ritter grew up far removed from the scene of much cowboy activity. He attended the University of Texas for five years (singing in the university glee club under the direction of Oscar Fox) and then went to Northwestern Law School for one year. Throughout his youth he had collected western and mountain songs, and therefore had a storehouse of interesting songs when he began singing on KPRC in Houston in 1929. In 1930, he joined a musical troupe on a series of one-night stands through the South and Midwest. By 1931, he had gone to New York, where he joined the Theatre Guild and began his acting career with a featured role in Green Grow the Lilacs (a short-lived play that eventually became the basis for the musical Oklahoma!). With his thick Texas accent and storehouse of cowboy lore, Ritter quickly emerged as a New York sensation. He became greatly in demand for lecture recitals in eastern colleges on the cowboy and his song. During the fall of 1932, he was the featured singer with the Madison Square Garden Rodeo and from there went on to a recording contract with ARC and a program slot on WOR entitled The Lone Star Rangers, one of the first western radio shows ever featured in New York City. From 1932 to 1936, he appeared on other New York stations, including the WHN Barn Dance, where he acted as cohost with Ray Whitley. Then, inevitably, in 1936, he made the first of several movies, Song of the Gringo. Ritter, however, was not a cowboy, but was instead a very believable interpreter of cowboy songs. Impressionable easterners were easily convinced that he came, not from a small East Texas community and a college background, but from a working cattle ranch. And Tex very skillfully lived up to the part.

Tex Ritter’s exploitation of the western theme was typical of what was happening all over the United States in the mid-1930s. From New York to California, individuals responded to the western myth, and “cowboy” singers and groups sprang up in all sorts of unusual places. “Western” became a rival and often preferred term to “hillbilly” as a proper appellation for country music. It is easy to understand, of course, why “western” would be preferred to the seemingly disreputable backwoods term. “Western,” specifically, suggested a music that had been developed by cowboys out on the Texas Plains or in the High Sierras; more generally, it suggested a context that was open, free, and expansive. In short, the term fit the American self-concept.

***

Listen to music writer Will Hermes’ interview with Bill Malone and Tracey Laird on the Longreads Podcast here (read as transcript).

Excerpted from Country Music USA. Copyright ©1968 by the American Folklore Society. Copyright © 1985, 2002, 2010, 2018 by the University of Texas Press. All rights reserved.

Smooth Spaces, Fuzzy Lives

Brian Lawless/PA Wire

Rachel Andrews | Brick | Summer 2018 | 18 minutes (4,831 words)

A photograph in an Irish newspaper depicts a member of the Garda Síochána shaking hands with his counterpart from the Police Service of Northern Ireland at one of the points where the territory of the Republic turns into that of Northern Ireland. The photograph, published in November 2015, seven months before Britain voted to exit the European Union, accompanies an article on plans for a “border corridor,” whereby police on both sides of the border can pursue fleeing criminals into each other’s region.

There’s a kind of joviality to the photograph: firm clasping of hands, big smiles. Behind the two men is the Irish landscape, rolling, misted, a river cutting through fields of green. The officers wear different uniforms, but the only obvious territorial demarcation, the only hint that they inhabit different countries, with different laws, health systems, and currencies, is a sharp change in road color, from black to sudden grey.

I remember this non-distinctiveness, the dawning awareness that I had crossed a boundary, from the many trips I took to Northern Ireland between 2007 and 2010, when I worked on an essay that documented the systematic demolition of the Maze prison, a story that presented itself symbolically and — as it turned out — all too simplistically as one of a settling of the past and a coming together for the future.

I never went North as a child. I remember a drawing in a newspaper depicting a map of Ireland. In the sliver of space that is Northern Ireland, the cartoonist had penned: “there be dragons.” In truth, it was worse than that. Ask me as a 6-year-old, a 12-year-old, about Northern Ireland and I would have responded: bombs and blood. Ask my young daughter today and she might look at you blankly. It means nothing to her, and that is a good thing.

There were ways it meant nothing to us too. I grew up in Cork, in the very south of Ireland, and that meant growing up a world away from bombs and blood. As children in the 1970s and 1980s, we were safe from soldiers in the back gardens, from streets we couldn’t walk down. But things filtered into our child worlds. From television: the dark loom of the watchtower, the helicopters, the aerial prison shots following the 1983 Maze escape; Gordon Wilson, who lost his 20-year-old daughter in the 1987 Enniskillen bombing: “I shall pray for those people tonight and every night.” Of the few discussions in school, I remember one: the classmate who had relatives in Belfast, and her upset, her anger, at our fear, our distancing and distaste.

As I got older and traveled in Europe, the easy comfort of that distancing — you and I are not alike — was undercut. “So where did you hide the bomb?” a French colleague joked when I worked for a summer at a hotel in Munich. “Until I met you, I thought all Irish people were savages,” a German girl told me during my Erasmus year in France. This was the early to mid-90s and everywhere I went, there it was. “We in Australia just can’t understand it,” said the visitor to my apartment in London. I still remember the insult of his bemusement and sincerity, as well as my own avoidance. As far as everyone on the outside was concerned, I was them and they were me. I knew better — I mean, was it not obvious?

The first time I went North was in 2000, two years after the Good Friday Agreement, after the Omagh bombing and the howl of anguish that went with it, after it became imaginable, almost normal, for me to drive in my tiny Southern-registered Fiat from Dublin to Belfast and back again as I researched a writing project on women working in politics in Northern Ireland. I was in my late 20s by then, and I wasn’t afraid in Belfast’s city center, which had the same familiar department store names as any British or Irish main street; nor on the Falls Road, which wrapped me in a warm blanket of tri-colors and Celtic symbols; but I felt heavy and intimidated as I made my way up the red, white, and blue pavements on Shankill Road to the offices of the Progressive Unionist Party, hesitant to speak in the corner store lest I betray myself through the soft spill of my Southern tones. But then this dissolved too, and seven years later, when I spent time interviewing former prison officers at the Maze, as well as the residents who had grown up beside the prison, all from a Unionist background, the sing of my Cork accent felt more like a benign curiosity than anything traitorous or threatening.

“Merging is dangerous,” writes Rebecca Solnit, “at least to the boundaries and definition of the self.” Is that why we wrestle against it so? The border with Northern Ireland, once a site of blocked roads and lookout towers, has evaporated, at least on the surface of things, but it remains a place of struggle, of contest, a tussle between those who wish to take it one way and those who would move in another direction, either within the boundaries of a unified Ireland or into the space of clarity that tells us where we end and they begin. The amorphous situation that has existed along the border for nearly twenty years, a fudge that has resisted discrete categorizations and that we seemed to have found a way to live with, or live within, is under pressure in the wake of the Brexit vote, as the clamor to once again define what we are and what we are not begins to accelerate. We look for the solace of certainty, of knowing if we are one thing or the other, rather than allow ourselves to remain within the complicated, messy space of the both/and, a state made possible by the exhaustion left after thirty years of violence.

Hannah Arendt had a particular view of merging. As she searched out a meaningful concept of a Jew’s place in the world following the sundering caused by the Second World War, she ultimately rejected a form of Zionism that connected citizenship to ethnicity and tethered both to the boundaries of the nation-state. On the other hand, she wrote scathingly of those European Jews who would assimilate, who would ape the Gentiles in an effort to find their way into the ranks of the human, who would, she wrote in disgust, become “good Frenchmen in France,” “good Germans in Germany.” Arendt, you could say, had been one such good German. As a child she did not know that she or her family were Jewish; she learned of her ethnicity only through the anti-Semitic taunts of children on the street. But it was the shocking stripping of her German citizenship as an adult in the 1930s that ultimately woke her to the helpless vulnerability of the assimilated Jew and formed her conviction that Jews must stand defiantly aloof from the boundaries of nationality, turning instead toward the belonging of the citizen; the belonging that attaches to full and complete membership of a political community; the belonging that confers the right to meaningfully speak, act, and be heard in such a community; the belonging that means inhabiting a territory without subscribing to an overarching identity narrative. “Refugees driven from country to country represent the vanguard of their peoples — if they keep their identity,” she wrote. Today her sentiments do not appear so different from those of Dina Nayeri, an Iranian refugee who received U.S. citizenship at 15 and became a French national at 30, and who wrote in the Guardian that she had lost interest in the need to rub out her face as tribute for these benefactions. “As refugees, we owed them our previous identity. We had to lay it at their door like an offering, and gleefully deny it to earn our place in this new country. 
There would be no straddling. No third culture here,” she said, although a third culture appears to be the choice made by Arendt’s beloved Heinrich Heine, at least as she described it, which was to live as both a German and a Jew rather than deny his Jewishness as the price of belonging. “He simply ignored the condition which had characterized emancipation everywhere in Europe — namely, that the Jew might only become a man when he ceased to be a Jew,” she wrote.

Arendt came out of a Europe that had, she witnessed, conclusively intertwined national rights with human rights, which left her as mistrustful of a national, bordered identity as she was of the “abstraction” of any solemn notions of the inalienable rights of man. Heine, the Prussian-born poet and literary critic, came of age in the early nineteenth century, an era of political instability and contentiousness in his homeland; his conversion to Lutheranism was reluctant, regretted, and carried out only as the price of “admission into European culture.” In the early years of the twenty-first century, there was a feeling — I had the feeling — that Europe, at least in some parts of the continent, had found its way beyond these aspects of its shattering history and was on the turn toward the global and the flexible. In 2002, when I lived for a time in Paris, I could board a plane in France and emerge in Italy, where I could retrieve my bags and leave the airport without showing any identification, without queues or questions. This identity-less travel, the result of the then seven-year-old Schengen Agreement and so opposite to my conditioned, normalized experience of waiting in dutiful lines, gave me the very real sense of being a human in the world. The continent of Europe — the part of it that now had a common currency and permeable frontiers, and even onwards toward the rapidly opening East — felt magical, enlightened even, as if we were all in this together. The distinctions between us, forged through cultural, religious, and geographic experience, appeared shapeless now. I could be both Irish and European; I felt that I could, as Arendt wrote, “speak the language of a free man and sing the songs of a natural one.”

But there was a “them.” From my window in Stalingrad, the quartier in the north of Paris where I lived from September to Christmas, I watched men in jeans and jackets congregate outside in the early darkness of the winter evenings, lining up in huddled rows on a Friday for weekly prayers. I looked on, curious — what are they doing? — before I understood. This was one year before the Iraq War, which fractured the Arab world, but already and for long years it was not easy to be Muslim in France, even if you were the French-born descendant of those who had come in the 1950s and 1960s as part of the first wave of migrant workers from northern Africa who stayed in search of a better life; even if you identified as both Muslim and French, as really all, or at least so many, of such descendants do, and as French civic society, with its emphasis on the primacy of the citoyen, encourages — in theory anyway. The exclus, they used to call them. The excluded. If I lived in Stalingrad today, the men across the street would no longer be there; in 2011, politicians banned the saying of street prayers in Paris following far-right protests about creeping Islamization. Instead, near the street I lived on, under the bridge where the metro station lay, there would almost certainly be tents and other makeshift shelters constructed by refugees from Iraq, Syria, Afghanistan, and elsewhere, part of a new and different wave of migration that, along with the 2008 economic crisis, has upended all of Europe. In 2002, I also went to Greece on a reporting assignment. There was no graffiti then comparing Angela Merkel to Hitler; today many in the desperately indebted country view the dominance of German capital as the source of their woes. In Italy, France, Germany, a radicalized electorate now supports nationalist parties, looks at the European Union with deep suspicion. We were never all in this together.

For the moment, I can travel from Ireland to Britain without a passport. For the moment, I can drive from Dublin to Belfast without stopping, as the road melts from the N1 to the A1 and the white and black sign informs me that speed limits are now being monitored in miles rather than kilometers. (How different to John McGahern’s experience in the early 1990s, recounted in the essay “County Leitrim: The Sky Above Us”: “There are ramps and screens and barriers and a tall camouflaged lookout tower,” he said of the border crossing at Swanlinbar in County Cavan. “A line of cars waits behind a red light. A quick change to green signals each single car forward. In the middle of this maze armed soldiers call out the motor vehicle number and driver’s identification to a hidden computer. Only when they are cleared will the cars be waved through. Suspect vehicles are searched. The soldiers are young and courteous and look very professional.”) By the time I will have finished writing this article, British Prime Minister Theresa May will have triggered Article 50, and the movements I have become used to taking between cities and countries will have been thrown into confusion. Since the terrorist attacks of November 2015, France has been in a state of emergency that includes a firm policing of its borders. For more than a year and a half, commuters travelling from Malmö in Sweden to Copenhagen in Denmark had to present their IDs. Temporary border controls have also been introduced by Germany, Austria, and Norway. Merging is dangerous. Those hoping for a united Ireland — and I am surely one — forget this. 
On his blog, the journalist and Northern analyst Andy Pollak notes that Andrew Crawford, the former special adviser to current Democratic Unionist Party (DUP) leader Arlene Foster, used to go through reports from one North-South body removing the phrase “all-Ireland.” Perhaps the action of deletion helped Crawford forge certainty, was part of an attempt to make sense of how he existed in time and space. Forging certainty helps us all as we construct both story and identity in order to figure out how to live, but certainty, or at least a fixed destination, gets us into trouble too: we blind ourselves to possibilities, to the creative potential that lies outside of the either/or, to what can happen when we follow Arendt, say, or Deleuze, that great demolisher of dualisms, into the space of the non-being, the uncertain, the becoming.

In her photographic series Kinderwunsch, Ana Casas Broda depicts her body in thrall to those of her children, an artist willing to lose herself in conversation with flux, with change, with overwhelm. The photos are intimate and direct. Casas Broda often stares unsmilingly at the camera: a candid, life-worn Olympia, her pregnant body naked and big, uncomfortable-looking with her second child, or scarred and slack following fertility treatment and birth. In one of the images, her children have marked her face and torso with crayon; she both encouraged this and passively accepted the results. “I am their canvas: they play with me and change me,” she said in an interview. Kinderwunsch means “desire to have children,” and Casas Broda submits, it appears to me, to the terror and the unknown of that primal desire. She tumbles downwards, inwards. In the photographs, her children clothe her in tissue paper, they cover her in Play-Doh. “I see their scribbles on my body as a symbol of how motherhood has changed me,” she said. What she is really depicting is dissolution (of a former self), symbiosis — and something else. In some of the images, she and her children appear as one, interwoven, but there are others where she is alone, or they are indifferent to her: a son plays a video game as she lies naked on a couch, in between mother and person, neither here nor there, her body nonetheless relaxed, strangely at ease in the moment.

Around the time I began my Maze project, I was experiencing the greatest disintegration of self I had ever felt. Crossing the border from North to South represented moments of enormous exhilaration and giddy freedom: dazed as I was, when I lay in a border hotel without the baby, who had just turned seven months, I thought that I could see a way back to myself, that the place where I ended and the child began would somehow become obvious again, clearly defined. I was wrong about that: there was no going backward. There was no going forward either, at least not in the way I wanted or imagined. Since the birth of my daughter, I remain in limbo land, the borders of a self so carefully constructed over nearly four decades now shifting. She arrived and I disappeared, something like that anyway. The categories I had thought surrounded me have dissipated into confusion and nothingness, and that, if I think about it too much, can be terrifying. Did I turn into you, I used to ask her when she was a baby, or have you become me?

When Colm Tóibín walked the border between North and South in 1987, he bumped into questioning British soldiers; a blown-up bridge on a road that once led from Dublin to Enniskillen; and, in Derry, a march led by former DUP leader Peter Robinson, then in the ascendant. Tóibín feared opening his mouth during the march, lest the crowd (young men in the main, some drunk) spot him as a Southerner. Despite the disappearance of the island’s physical frontier, the hangover from these tensions remains. My friend, a middle-class Northerner from a Catholic background who has lived in Dublin for nearly twenty years, at times employs turns of phrase that leave me reaching for a Cockney rhyming slang dictionary. Yet she and I both use colloquialisms a person born in England will never have heard. Nonetheless, for a long time my friend was lost and lonely in Dublin, reluctant to move back to a society still undercut by a deep lack of trust but without solid ground in the cultural space of the South; she felt different. “I was different,” she told me, as I tried to grasp her feelings of statelessness. It’s not as if we are from different countries, I told myself, not really anyway. But the thing is, we are, both literally and metaphorically. The border has dissolved, but trauma, so deep, so wounding, cuts us off from one another, makes strangers of us in the same land, pulls me one way, pushes her another; trauma turns a society inward, and it has turned Northern Ireland, in the words of retired Oxford professor of Irish history Roy Foster, into more of its own little place than ever. What we have in common is this (and this is easy to write and hard to live): we are more the same than we are different.

The artist Rita Duffy grew up Catholic in a largely Protestant area of Belfast; she is the progeny of a Southern mother and a father whose own father, a Catholic from the Falls Road, died at the Somme. Her two great-uncles on her mother’s side supported and may have been actively involved in the 1916 rebellion, which ultimately led to Irish independence, a civil war, and the fracturing of the island of Ireland. “I was continually fluctuating between nationalities, between identities, curious to know could I somehow land up in the middle somewhere that satisfied me today,” she told a symposium I attended in 2016. In recent years, Duffy has established herself within the space of the liminal: “I crept out to the edges of Ulster and we bought a little piece of the border. We built a house and I now have a studio just a mile and a half on the Southern side, and I live a mile and a half on the Northern side, so I kind of live in neither-here-nor-there land, which is a really interesting place to be as an artist. It’s very confusing and out of that springs the best imaginative possibilities for me.” Out of those imaginative possibilities have arrived big, bold ideas. The Titanic passenger liner was built in Belfast; the tragedy of its unfulfilled promise can be viewed as a metaphor for the long years the North lost to violence. In 2005, Duffy founded Thaw, a company set up to fund the towing of an iceberg from the Arctic to Belfast, where it would be moored outside the city and allowed to melt, in the process encouraging the shrinking of the deep, frozen divisions that still exist within Northern Irish society. Duffy has not yet found the means to drag her iceberg to Belfast, but since 2003 her paintings have been replete with the mythology of those hulking, frozen structures. 
She has created figurative images that appear trapped, encased in ice: Father Edward Daly, crouching, waving his handkerchief; a close-up of an arm, gesturing, holding a white handkerchief that may itself be an iceberg in miniature; in another painting, there is a Pieta, a mother holding a dying son, both emerging out of the bulk of an ice structure.

Duffy paints these images in greys and yellows, sometimes browns or greens, always muted. But in the middle, or in the distance, there is something, a speck of brightness, a blob, the white-grey of Father Daly’s handkerchief-iceberg, the light that draws your eye and that looks and feels like a breath of gulping air. If you thaw the frozenness, if you let it melt into the Irish Sea, then a space can open up, the iceberg no longer blocks your view and holds you in its frozen time. Behind you lies the city, with its plurality of people, before you the sky and the vastness of the ocean, deep and bold and cerulean blue. Duffy’s iceberg queen, a mammoth, back-turned Victoria, ascends into that blue, the blue of space, the blue of a possibility that allowed for an impossible friendship during the short time that former IRA member Martin McGuinness and the once-trenchant defender of Unionism, Reverend Ian Paisley, worked together in government in Northern Ireland. If you thaw the frozenness, a space opens up, and into that space walked Ian Paisley Jr., son of the good preacher, on various radio stations in January 2017, offering “humble and honest thanks” to Martin McGuinness on the occasion of the latter’s retirement.

In the North Atlantic, the largest iceberg on record was measured at 550 feet above sea level, the same height as a 55-story building; less tremendous ice structures can still reach more than 200 feet high. The Titanic, travelling at top speed on a calm night, crashed into an iceberg that was more than a mile long and 100 feet high and had been growing into its dense mass of packed ice since the time of Tutankhamun, although once such an iceberg drifts from the Arctic to the warmer waters of the North Atlantic, which this one had, it will normally melt in two or three years. To an impatient human eye, this melting will be imperceptible until it is close to completion. My daughter likes to play with ice cubes; she takes them from her glass of water and lays them on the table, where she can contemplate their light, their translucence. When she first started this game, I watched benignly; these days I place a tissue or napkin on the table to soak up the water that spreads out so suddenly as the cube, whole only a moment before, turns liquid before our eyes.

In Bosnia, where I’ve been doing research, the iceberg is still solid, a mountainous whole that blocks ethnicities from seeing across to each other. The Bosnian peace deal of 1995 somehow managed to avoid the formation of literal borders; instead, the populace has retreated into different enclaves across the country: Muslims stick with Muslims, Serbs with Serbs, and so forth. The saddest example of this is Sarajevo, which now sits within the Federation of Bosnia and Herzegovina, one of two political entities that compose Bosnia. Sarajevo’s population is now almost 90 percent Muslim, many of them newcomers since the ending of the war; former residents, most of whom will never go back, mourn the city that was once multi-ethnic and cosmopolitan. Everything has changed, they will tell you, shaking their heads; the city is totally different. There is still separatist agitation, particularly in the Republika Srpska, the political entity that sits closest to Serbia proper, whose nationalist leaders threaten to form their own tiny state, but the frozen iceberg contains more than that: it holds the pain of deceit, of mistrust, of horrors, of loss, of history and geography, of denial and defense. Most of my time in Bosnia was spent in the Republika Srpska, in the east of the country, where I found myself crossing and recrossing the border with Serbia. At each crossing, I encountered the checkpoints: the wait, the documents handed over, the computer clearance, the questions on occasion, the stamps. My husband, a photographer, made the crossing alone once and was held for two hours while guards went through his equipment, his backpack, his wallet.
The determined absorption of different religions and cultures into the shape of a single Yugoslavia after the First and Second World Wars had its problems too, but those Bosnians old enough to remember the time when the many amalgamated into the one speak of it wistfully, softly, as if it were a fairy tale they used to tell themselves as children. Their iceberg was waiting, biding its time out at sea, before it floated inland to lodge itself forcibly among them. The disappearance of the border between North and South Ireland has not sunk our icebergs. But over the past sixteen years, until the schism that was June 23, 2016, we had found ways to float one with the other, moored and not, comfortable and not, settled and not.

Is it possible to hold two contrary ideas at the same time: that sense that merging is both terrifying and monumental, the knowledge that we are all different, but that we live within a common world, that we can choose to be something and not? Although Alice Notley wrote that the birth of her child had left her “undone” — “feel as if I will / never be, was never born” — she could still see the other side: “Of two poems one sentimental and one not / I choose both / of his birth and my painful unbirth I choose both.” She hung in the balance, remained midway, gave herself over to not settling in. The child that was a baby when I began my Maze project recently turned ten and is in process, in transition. She is a self that I am not, although that self, according to Deleuze and Guattari, is only “a threshold, a door, a becoming between two multiplicities.” Her identity is no more fixed than mine is, than mine ever was, for all that I have scrambled to chase it. “What is real,” write the philosophers, “is the becoming itself. The only way to get outside the dualisms is to be-between, to pass between, the intermezzo.” There is confusion, and much relief, in such malleable thinking.

I was wrong, of course, in the assumptions I made about the Maze story. After the initial openness that followed the prison’s closure in 2000, when the paramilitary prisoners were let out and the public allowed in, political wrangling slowly strangled the goodwill until the great gates swung shut again; they have stayed shut, more or less, ever since. When I talked to people in and around the prison about politics and the peace, they felt bitter and hurt and sad, and that was not easy to hear. But any hard edges of fear and certainty seemed also to have blurred into a resignation that meant we could at least stand outside of compartmentalizations and inside the fuzzy space that doubt tends to uncover.

It was almost always cold at the Maze; even during summer, the fog hung heavy over its vast flatness. When I was in need of warming, I would retreat to the small security hut at the entrance to the site, where a handful of guards took phone calls and processed visitors. What I recollect about these visits are the moments of recognition. One of the men, a gentle soul, English-accented, who had lost his wife too early and now lived a simple life of work and extended family, was the chatty type. I still remember how he once articulated my fear. “You dip your finger in a pool of water, swirl it about for a while, and when you take it out, the water will return to the way it was. Then it will be as if you never were.”

***

This essay first appeared in Brick, the biannual print journal of nonfiction based in Canada and read throughout the world. Our thanks to Rachel Andrews and the staff at Brick for allowing us to reprint this essay at Longreads.

 

The Wheel, the Woman, and the Human Body

CTK via AP Images

Margaret Guroff | The Mechanical Horse | University of Texas Press | April 2016 | 35 minutes (4,915 words)

Angeline Allen must have been pleased. On October 28, 1893, the 20-something divorcée, an aspiring model, made the cover of the country’s most popular men’s magazine, a titillating journal of crime, sport, and cheesecake called the National Police Gazette. Granted, the reason wasn’t Allen’s “wealth of golden hair” or “strikingly pretty face,” though the magazine mentioned both. Rather, the cover story was about Allen’s attire during a recent bicycle ride near her Newark, New Jersey, home. The “eccentric” young woman had ridden through town in “a costume that caused hundreds to turn and gaze in astonishment,” the Gazette reported.

The story’s headline summed up the cause of fascination: “She Wore Trousers” — dark blue corduroy bloomers, to be exact, snug around the calves and puffy above the knees. “She rode her wheel through the principal streets in a leisurely manner and appeared to be utterly oblivious of the sensation she was causing,” according to the reporter.

It is unlikely Allen was truly oblivious, having already shown an exhibitionistic streak over the summer when she appeared on an Asbury Park, New Jersey, beach in a bathing skirt that “did not reach within many inches of her knees,” according to a disapproving newspaper report. (“Her stockings or tights were of light blue silk,” the report added.) Allen didn’t mind people noticing her revealing outfits — “that’s what I wear them for,” she told one reporter — and she kept cycling around Newark in pants despite the journalistic scolding. As another paper reported that November, “The natives watch for her with bated breath, and her appearance is the signal for a rush to all the front windows along the street.”

For a grown woman to reveal so much leg in public was a staggeringly brazen act. What was, by then, notably unnoteworthy was Allen’s choice of vehicle. Ten years earlier, all bicycles had been high-wheelers, and riding one had been largely the province of daring, athletic men. The women who had attempted it were seen as acrobats, hussies, or freaks; one female performer who rode a high-wheeler in the early 1880s was perceived as “a sort of semi-monster,” another woman reported. But by the early 1890s, the bike had undergone a transformation. Allen’s machine — a so-called safety bicycle — had two thigh-high wheels; air-filled rubber tires; and rear-wheel drive, with a chain to transmit power from the pedals. In fact, it looked a lot like a 21st-century commuter bike, and it had become nearly as acceptable as one. Even the fashion police who scorned Allen’s riding outfit didn’t object to her riding.

What had happened to the bicycle in the interim? Market expansion. In the 1880s, when bicycle makers had begun to saturate the limited market for high-wheelers, they sought products to entice other would-be riders, particularly men who had aged out of the strenuous high-wheel lifestyle. In the United States, where bad roads made tricycle ridership impractical, the sales potential for an easy-to-ride bicycle looked stronger than in Europe. In response, manufacturers on both sides of the Atlantic created a profusion of high-tech two-wheelers, including models with foot levers instead of pedals; “geared up” bikes with chains and sprockets that spun the driving wheel more than once for each rotation of the cycle’s cranks; and a supposedly header-proof version with the small wheel in the front and the big wheel in the rear. Riders and makers started calling the standard high-wheeler an “Ordinary” to distinguish it from experimental models.

Several of the new bikes used geared-up rear-wheel drive as a way to bring the rider closer to the ground. The most influential of these was the English Rover, with a rear driving wheel only 30 inches tall that had as much force as a 50-inch Ordinary wheel. (Even today, American bicycle gears are measured in “gear inches,” which indicate how tall an Ordinary wheel of equivalent force would be.) At 36 inches, the Rover’s front wheel was slightly bigger than its rear one, but apart from that, the machine looked as streamlined as some models of fifty or a hundred years later.

Introduced in England in 1885, the Rover Safety Bicycle delivered the speed of an Ordinary, but with a greatly diminished risk of skull fracture from flying over the handlebars. The Rover’s manufacturer made some quick refinements, and a model with same-sized wheels caught on in Britain and inspired a fleet of imitators: low-mount, rear-wheel-drive bikes also called “safeties.”

The major US manufacturers weren’t impressed by this new low profile, though; they dismissed the safety style as a mistake. In 1886, after a two-month tour of England’s bicycle factories, the US industry titan Albert Pope expressed confidence in his high-wheeler: “I looked at nearly all the principal [English] makes and I could not find a point that was in any way an improvement over our own.” Echoed his lieutenant, George H. Day, who also made the trip, “Every innovation is regarded as a trap.”

But when imported safeties hit the US market in the spring of 1887, the machines found eager buyers; Pope and other American cycle makers scrambled to put out their own versions of the header-resistant contraptions. By November, the safety bicycle was established in the United States as the modern option for men, even though its low wheels evoked the comically old-timey velocipede of 20 years prior, as one bard made clear in the accented voice of an immigrant child:

In days of old, full many a time
You’ve heard it told, in prose and rhyme,
How down the street a wheelman came,
And chanced to meet his beauteous flame
Just where a pup in ambush lay,
To tip him up upon the way,
And make him wish that he was dead,
While gyrating upon his head.
In days of old
You’ve heard it told.
But nowadays, it’s otherwise.
The safety craze new joy supplies;
The boulders lose their terrors grim,
Stray cans and shoes are naught to him;
He laughs at rocks, he kicks the pup,
But, in the end, things even up;
For, as his maid he gayly greets,
Some unwashed urchin always bleats —
“Hi, look at der big man on der melosipetes!”

For a short time, Ordinaries and safeties coexisted like Neanderthals and Homo sapiens, with the bigger, older species continuing to inhabit its traditional niche while the smaller, nimbler creature carved out a new one. “I do not think that [the safety] will hurt the sale of the Ordinary bicycle,” predicted one US industry watcher in late 1887. “It will open the pleasures of cycling to a great many who have been afraid to venture upon a high machine.” The writer was thinking of physicians and other “professional men” for whom an Ordinary was too dangerous, but some enthusiasts suspected that the safety would also appeal to female riders. Offering women “a clumsy wheelbarrow of a tricycle” to ride while men zip around on slender bikes, wrote one sympathetic man, “is offering a woman a stone to eat while men have soft biscuit.”

And the safety bicycle’s low profile did intrigue many American women, especially after the spring of 1888, when makers offered a drop-frame version, in which the bike’s top bar scooped downward to make room for a lady rider’s long skirts. As one woman reported that year, “A sudden desire began to awake in the feminine mind to ascertain for itself by personal experience, what were those joys of the two-wheeler which they had so often heard boastfully vaunted as superior, a thousand times, to the more sober delights of the staid tricycle.”

With the safety’s smaller wheels, its ride was bumpier than the Ordinary’s at first. But then came the pneumatic tire. Devised in Ireland in 1888 by a veterinarian named John Boyd Dunlop, who was seeking a faster ride for his son’s trike, the air-filled rubber tube cushioned the road’s ruts and bulges in a way that springs and other early shock-absorbing devices never could. This marvel arrived in the United States by 1890 and became standard equipment on American safeties within a few years. “It permitted travel on streets and roads previously thought unrideable,” recalled an American journalist of the time, “and added to cycling a degree of ease and comfort never dreamed of.”

In the 1890s, bikes got lighter as well as more comfortable. The average weight of a bicycle dropped by more than half during the decade’s first five years, falling from 50 pounds to 23. And since new gearings were able to mimic wheels larger than those of the largest Ordinary, speed records fell too. In 1894, while riding a pneumatic-tired safety around a track in Buffalo, New York, the racer John S. Johnson went a mile in just over one minute and thirty-five seconds, a rate of nearly thirty-eight miles an hour. He beat the previous mile record for a safety by fourteen seconds, and the record for an Ordinary by nearly a minute — and the record for a running horse by one-tenth of a second.

The Ordinary — which had by then acquired the derisive nickname of “penny-farthing,” after the old British penny and much smaller farthing (quarter-penny) coins — became obsolete. High-wheelers that had sold for $150 to $300 just a year or two earlier were going for as little as $10.

The first safeties, meanwhile, cost an average of $150 during a time when the average worker earned something like $12 a week. At such prices, the new bikes targeted the same upscale demographic as the tricycle. But a strong market for safeties among well-to-do women goosed production, and competition among manufacturers reduced prices, making the bikes affordable to more would-be riders — and further fueling demand. In 1895, America’s 300 bicycle companies produced 500,000 safeties at an average price of $75, according to one encyclopedia’s yearbook. Even manufacturers were surprised at the demand among women, who thrilled to the new machine’s exhilarating ride. As one female journalist wrote, “If a pitying Providence should suddenly fit light, strong wings to the back of a toiling tortoise, that patient cumberer of the ground could hardly feel a more astonishing sense of exhilaration than a woman experiences when first she becomes a mistress of her wheel.”

It wasn’t just that women enjoyed the physical sensation of riding — the rush of balancing and cruising. What made the bicycle truly liberating was its fundamental incompatibility with many of the limits placed on women. Take clothing, for example. Starting at puberty, women were expected to wear heavy floor-length skirts, rigid corsets, and tight, pointy-toed shoes. These garments made any sort of physical exertion difficult, as young girls sadly discovered. “I ‘ran wild’ until my 16th birthday, when the hampering long skirts were brought, with their accompanying corset and high heels,” recalled the temperance activist Frances Willard in an 1895 memoir. “I remember writing in my journal, in the first heartbreak of a young human colt taken from its pleasant pasture, ‘Altogether, I recognize that my occupation is gone.’” Reformers had been calling for more sensible clothing for women since the 1850s, when the newspaper editor Amelia Bloomer wore the baggy trousers that critics named after her, but rational arguments hadn’t made much headway.

Where reason failed, though, recreation succeeded. The drop-frame safety did allow women to ride in dresses, but not in the swagged, voluminous frocks of the Victorian parlor. Female cyclists had to don simple, “short” (that is, ankle-length) skirts in order to avoid getting them caught under the bicycle’s rear wheel. And to keep them from flying up, some women had tailors put weights in their hems or line their skirt fronts with leather. Other women, like Angeline Allen, shucked their dresses altogether and wore bloomers. The display that reporters had deemed shocking in 1893 became commonplace just a few years later as more and more women started riding. “The eye of the spectator has long since become accustomed to costumes once conspicuous,” wrote an American journalist in 1895. “Bloomer and tailor-made alike ride on unchallenged.” (For her part, Allen may well have given up riding, but not scandal; she progressed to posing onstage in scanty attire for re-creations of famous paintings, a risqué popular amusement.)

Bicyclists’ corsets changed too, though less publicly. The corset of the 1880s was an armpit-to-hip garment stiffened with whalebone stays, which helped the hips support heavy skirts that hung from the waist. But while corsets braced women’s torsos, they also weakened their wearers, squeezing women’s lungs and displacing other internal organs, making deep breaths impossible. Out of necessity, female cyclists looked for alternatives, and many chose another garment that had been advocated by dress reformers decades earlier: a sturdy, waist-length cotton camisole with shoulder straps. When introduced in the 1870s, this garment was called an “emancipation waist,” and it featured a horizontal band of buttons at the hem, to which drawers or a skirt could be attached. Later versions were named “health waist” or, finally, “bicycle waist.” One 1896 model included elastic insets; its maker promised the wearer “perfect comfort — a sound pair of lungs — a graceful figure and rosy cheeks.” All for $1, postpaid.

If women’s clothing constrained them, so did their role in society. More Americans than ever worked outside the home; by 1880, farmers made up a little less than half of the country’s labor force. But even among the urban working class, married women typically stayed home during the day to cook, clean, tend to children, and often manufacture homemade goods for sale. Meanwhile, their husbands, sons, and unmarried daughters toiled in factories, shops, offices, and other people’s houses. Many Americans came to believe that men and women naturally inhabited two separate spheres: men held sway in business, politics, and other public arenas, and women took charge of the home. For most middle-class women, respectability meant appearing in public only under certain circumstances — such as while shopping — and making as small an impression as possible. “A true lady walks the streets unostentatiously and with becoming reserve,” instructed an 1889 etiquette manual. “She appears unconscious of all sights and sounds which a lady ought not to perceive.”

In addition, an unmarried young woman didn’t go out without a chaperone, usually an older female relative. Being seen on an unchaperoned date, even at a restaurant or other public place, could be cause for social ruin. An 1887 etiquette guide warned against sailing excursions, for example, lest the boat be becalmed overnight: “A single careless act of this sort may be remembered spitefully against a girl for many years.”

The bicycle challenged all that. Wives who had stayed close to home — venturing out only on foot, by trolley, or, if wealthy, with a driver and horse-drawn carriage — were suddenly able to travel miles on their own. Being so mobile, and so visible, was a revelation to many. “The world is a new and another sphere under the bicyclist’s observation,” wrote one female journalist. “Here is a process of locomotion that is absolutely at her command.” If a woman’s sphere begins to feel too small, wrote another, “the sufferer can do no better than to flatten her sphere to a circle, mount it, and take to the road.”

As for unmarried women, manners mavens urged them to cycle only with chaperones, but the rule didn’t take. “New social laws have been enacted to meet the requirements of the new order,” reported one newspaper editor in 1896. “Parents who will not allow their daughters to accompany young men to the theatre without chaperonage allow them to go bicycle-riding alone with young men. This is considered perfectly proper.” According to the editor, the reason for this difference was the “good comradeship” of the bicycling set. Fellow enthusiasts looked out for one another on the road, he wrote — so in a way, every ride was supervised. The historian Ellen Gruber Garvey suggests a second possible reason: propriety already allowed unmarried women to ride horses unchaperoned. Bicycles, as a less costly equivalent, may simply have extended this freedom down the economic scale.

But the same things that made the bicycle liberating also made it threatening. Moralists warned that skimpy costumes and unsupervised travel would lead to wanton behavior. “Immodest bicycling by young women is to be deplored,” declared Charlotte Smith, founder of the Women’s Rescue League, a group that lobbied Congress on behalf of “fallen women.” “Bicycling by young women has helped to swell the ranks of reckless girls, who finally drift into the standing army of outcast women.” Smith reported that her tours of brothels and interviews with prostitutes confirmed this.

Physicians — who at the time shouldered responsibility for patients’ moral as well as physical well-being — had their own concerns. One visited New York’s Coney Island and saw a 16-year-old cyclist get drunk on wine provided by a beautiful but nefarious older woman. “She looked like an innocent child, but was away from home influence,” the doctor reported. Many physicians fretted that pressure from the bicycle seat would teach girls how to masturbate, a practice thought to lead to spiritual and psychological decline. Climbing hills on a bike could excite “feelings hitherto unknown to, and unrealized by, the young girl,” wrote one doctor in 1898. (Boys faced the same danger: pressure on the perineum would call their attention to the area, warned one doctor, “and so lead to a great increase in masturbation in the timid [and] to early sexual indulgence in the more venturous.”)

The bicycle’s peril was medical as well as moral. In the late nineteenth century, many saw physical energy as a finite resource that had to be carefully parceled out, not a power that could be renewed through exercise. The fashionable malaise of neurasthenia was only one of the disorders thought to be caused by a depletion of energies. Overexertion could also cause tuberculosis, scoliosis, hernias, heart disease, and other maladies, doctors believed. Safely sedentary middle-class women, who frequently suffered from varicose veins and other consequences of annual pregnancies, were prone to fatigue; one Boston writer called them “a sex which is born tired,” adding that “society sometimes seems little better than a hospital for invalid women.” Particularly for women in heavy dresses and constricting corsets, any activity that raised the heart rate could seem more likely to cause fainting and listlessness than to remedy them. Opponents of the bicycle latched onto this perception, arguing that riding would cost women more effort than they could afford. “The exertion necessary to riding with speed … is productive of an excitation of nervous and physical energy that is anything but beneficial,” Charlotte Smith warned. “If a halt is not called soon, 75 percent of the cyclists will be an army of invalids within the next ten years.”

But even as Smith made her dire predictions, Americans’ fear of cardiovascular exercise was beginning to lift. For decades, health reformers had trumpeted the benefits of fitness, and during the 1880s, the United States saw a spike in organized physical activity. Citizens of America’s growing cities tried new sports such as baseball and football, and exercise advocates built the first public playgrounds and pushed for physical education for both boys and girls. Doctors continued to caution against overexertion, but they acknowledged that, in moderation, fresh air and exercise tended to improve patients’ health. The high-wheel bicycle of the 1880s proved the benefits of regular exercise to those who could ride it; proponents made extravagant claims for the risky machine’s ability to restore well-being. “For constipation, sleeplessness, dyspepsia, and many other ills which flesh is heir to, not to speak of melancholy — all are curable, or certainly to be improved, by the new remedy, ‘Bicycle,’” wrote a Texas physician in 1883. “It is always an excellent prescription for the convalescents, and nearly always for chronic invalids.”

Not everyone could take the prescription, though. High-wheeled cycling and rigorous team sports were acceptable only for young men. The new games deemed suitable for mixed company, such as lawn tennis and golf, were far less taxing — and therefore far less likely to lead to noticeable improvements in fitness. As for working out on your own, the recommended options were either too costly (horseback riding) or too boring (indoor calisthenics) to gain much popularity. As a result, many more Americans of the 1880s thought they ought to exercise than actually did it. So when the safety bicycle appeared at the end of the decade and Americans began riding in large numbers — an estimated two million by 1896, out of a population of about seventy million — few were certain how such vigorous physical activity would affect them.

Doctors were wary. Most US physicians believed that each patient’s condition was based largely on his or her habits and experiences, the weather, and other environmental factors. Good health was a reflection of proper balance among bodily systems and energies. “A distracted mind could curdle the stomach, a dyspeptic stomach could agitate the mind,” writes the medical historian Charles Rosenberg. It was a doctor’s job to know each patient well enough to restore balance when something was out of whack, using laxatives, diuretics, and other purging drugs to reboot the system. Even contagious diseases could not be treated in a cookie-cutter fashion, argued an 1883 medical journal editorial: “No two instances of typhoid fever, or of any other disease, are precisely alike … No ‘rule of thumb,’ no recourse to a formula-book, will avail for proper treatment even of the typical diseases.” To many doctors, advocating a specific drug to cure a specific disease seemed the height of quackery.

And just as there were no one-size-fits-all medical treatments, many physicians believed there were no one-size-fits-all exercise routines. While cycling enthusiasts rhapsodized about the safety bicycle’s benefits for riders of both sexes and all ages, doctors fretted that many of their patients would be harmed by the new machines. Even seeming success stories were suspect. In an 1895 paper on heart disease, one doctor reported that a patient who had panted for breath after climbing one flight of stairs was now able to cycle up hills with ease. “It would be wrong to conclude from this that cycling is not injurious,” the doctor wrote: there hadn’t yet been time to observe the bicycle’s long-term effects. Moreover, as an unfamiliar activity, cycling tended to catch the blame for pretty much anything bad that happened to a new rider afterward, up to and including death.

Logically, acute injuries were a concern. Though the safety bicycle did greatly reduce the risk of head wounds, it didn’t obliterate that risk, particularly among “scorchers” — thrill-seeking youngsters who hunched over their handlebars and pedaled as fast as they could. “It might seem almost impossible to fracture a skull thick enough to permit indulgence in such practices,” reported the Boston Medical and Surgical Journal, “but the bicycle fool at full speed has been able to accomplish it.” Medical journals also noted the danger of road rash and broken bones.

More insidious than crash injuries, though, were new chronic complaints attributed to cycling. The bent-over posture of the scorcher was thought to cause a permanent hunch called “kyphosis bicyclistarum,” or, familiarly, “cyclist’s stoop.” Repeated stress to the cardiovascular system — that is, regular workouts — could lead to the irregular heartbeats and poor circulation of “bicycle heart.” Gripping the handlebars too tightly might cause finger numbness, or “bicycle hand,” and a dusty ride could trigger “cyclist’s sore throat.” Practically every body part seemed to have its own cycle-related malady; at least one New York doctor devoted his entire practice to treating such ailments.

Of all the physical woes attributed to the bike, the one that most strained credulity was the “bicycle face.” Characterized by wide, wild eyes; a grim set to the mouth; and a migration of facial features toward the center, the disorder was said to result from the stress of incessant balancing. A German philosopher claimed that the condition drained “every vestige of intelligence” from the sufferer’s appearance and rendered children unrecognizable to their own mothers. The bicycle face hung on, too, warned a journalist: “Once fixed upon the countenance, it can never be removed.”

The doctors raising these alarms were careful to state that many of the new diseases affected only cyclists predisposed to them — which would explain why so few of their fellow physicians might have encountered the disorders. “Whilst thousands ride immune, a small percentage will suffer,” wrote one doctor. Another, who blamed cases of appendicitis, inflammatory bowel disease, and the thyroid condition Graves’ disease on excessive riding, said it didn’t matter how many people believed that cycling had improved their health: “It would not affect my argument in the least if swarms of them had been rescued from the grave.”

Nevertheless, the more Americans took to bicycling, the more tenuous these claims of danger came to seem. The machine made physical activity both practical and fun. “The bicycle is inducing multitudes of people to take regular exercise who have long been in need of such exercise, but who could never be induced to take it by any means hitherto devised,” one doctor wrote in Harper’s Weekly in 1896. And all that activity had an effect. Riders quickly noticed improved muscle tone, increased strength, better sleep, and brighter moods. Women, especially, transformed themselves, wrote the novelist Maurice Thompson in 1897: “We have already become accustomed to seeing sunbrowned faces, once sallow and languid, whisk past us at every turn of the street. The magnetism of vivid health has overcome conservative barriers that were impregnable to every other force.”

The empirical evidence of cycling’s health value began to overtake conservative doctors’ concerns, as the rhetoric scholar Sarah Overbaugh Hallenbeck argues. Though many physicians continued to raise objections to the sport, their voices were increasingly drowned out by those of more observant — and pragmatic — practitioners. “The bicycle face, elbow, back, shoulders, neck, eroticism,” wrote one military doctor in 1896, “I pass as not worthy of serious consideration.” Rather than discourage bicycle use, most physicians came to cautiously endorse it. “So long as the cyclist can breathe with the mouth shut,” wrote one such doctor in 1895, “he is certainly perfectly safe.” Some went further, citing evidence of the bike’s benefits for heart patients, migraine sufferers, diabetics, and others with chronic conditions. In Chicago, the demand for injectable morphine dropped as patients with anxiety or insomnia “discovered that a long spin in the fresh air on a cycle induces sweet sleep better than their favorite drug,” the Bulletin of Pharmacy reported.

This shift paralleled a transformation in medical thinking during the 1890s, when American physicians increasingly embraced the scientific method. Some clinics in Continental Europe had adopted this evidence-based approach early in the nineteenth century, using statistics to determine the efficacy of treatments and evaluating patients’ conditions according to universal norms, rather than trying to divine what was normal for each individual patient. In the United States, however, doctors arguing for this approach were long in the minority. According to Rosenberg, the rift between medical traditionalists and empiricists “provided an emotional fault line which marked the profession throughout the last two-thirds of the century.” Only at the very end of the nineteenth century did a research-based, objective philosophy take hold at US medical schools.

It would be folly to suggest that the bicycle alone caused this transformation. Many other factors were at play, such as improved trans-Atlantic communication; an influx of European immigrants, including scientists; and a snowballing of evidence for new medical concepts such as the germ theory of disease. For centuries, Western healers had believed that contagion could erupt spontaneously, but between 1870 and 1900, researchers disproved this theory by isolating the microscopic causes of illnesses including typhoid, tuberculosis, cholera, diphtheria, meningococcal meningitis, plague, and malaria.

But even if the bike did not independently modernize American medicine, its unprecedented impact on fitness — and the clash this revealed between what doctors said and what experience showed — may well have accelerated the shift. Much as the bicycle triggered changes in women’s dress that high-minded advocacy could not, it bolstered scientists’ then-radical argument that what is good for one human body tends to be just as good for another.

To the bicycle faithful of the 1890s, this seemed to be just the beginning of the changes that the machine would bring about. The gulf between social classes would recede under the influence of this “great leveler,” one enthusiast wrote in the Century Magazine: “It puts the poor man on a level with the rich, enabling him to ‘sing the song of the open road’ as freely as the millionaire, and to widen his knowledge by visiting the regions near to or far from his home, observing how other men live.”

And while women may not yet have had full access to higher education — or even the right to vote — the unchaperoned, self-propelled bloomer girl seemed to be pedaling in that direction. “In possession of her bicycle, the daughter of the 19th century feels that the declaration of her independence has been proclaimed,” wrote one female journalist, “and, in the fulness of time, all things will be added to complete her happiness and prosperity.”

The first-wave feminist Susan B. Anthony was born in 1820, the year after Charles Willson Peale built his iron draisine. By the time of the safety bicycle boom of the 1890s, she was a snowy-haired eminence, too old to risk riding, but she had an opinion of the sport. “I’ll tell you what I think of bicycling,” she said in an 1896 newspaper interview as she leaned forward to lay a hand on the reporter’s arm. “I think it has done more to emancipate woman than any one thing in the world.”

***

From The Mechanical Horse: How the Bicycle Reshaped American Life. Copyright © 2016 by Margaret Guroff. All rights reserved, with permission of the University of Texas Press.

How the Self-Publishing Industry Changed, Between My First and Second Novels

Photo: Nicole Dieker

As of this writing, my self-published novel The Biographies of Ordinary People: Volume 2: 2004–2016 is ranked #169,913 out of the more than one million Kindle books sold on Amazon. When Biographies Vol. 2 launched at the end of May, it ranked #26,248 in Kindle books and #94,133 in print books. At one point my book hit #220 in the subcategory “Literary Fiction/Sagas.”

So far, Biographies Vol. 2 has sold 71 Kindle copies and 55 paperbacks, which comes to about $360 in royalties.

I know what you’re thinking, and you’ve probably been thinking it since you saw the words “self-published.” But no, those sales numbers aren’t because my books are terrible—and I didn’t self-publish because my books were terrible either. (It’s a long story, but it has to do with an agent telling me that I could rewrite Biographies to make it more marketable to the traditional publishing industry, or I could keep it as an “art book” that would be loved by a select few.) Last year’s The Biographies of Ordinary People: Volume 1: 1989–2000 was named a Library Journal Self-E Select title; Vol. 2 was just selected as a Kirkus Reviews featured indie, with the blurb “A shrewdly unique portrait of everyday America.” I regularly get emails from readers telling me how much my books have meant to them, and how they couldn’t put their copies down.

So. I could tell you a story that makes The Biographies of Ordinary People sound like a triumphant success, and I could also tell you that in its first year of publication, Biographies Vol. 1 sold 382 ebooks and 157 paperbacks, earning $1,619.28 in royalties.

City on a Hill

Getty / Photo illustration by Katie Kosma

Leslie Kendall Dye | Longreads | June 2018 | 11 minutes (2,944 words)

 

At the top of Riverdale, at the top of the Bronx, there is a city on a hill. The city exists within a single building; there are single rooms with no locks, each with a bed, a dresser, and — if the resident’s family provides one — a television set with which to while away the hours. Time is measured by the same clock as it is in other cities, but here it curves and collapses, compresses yet languorously stretches. Once a week there is a hairdresser and a manicurist, too. It is lovely — and dreadful. You can visit the citizens here, and you are free to leave when you are ready, if freedom is measured by the movement of one’s feet. My mother lives in this building, which is a nursing home. We signed the contract for her just last month, in what might have been human blood.

The city-building overlooks the Hudson River, which today glimmers silver under a portentous sky. It’s spring by the calendar, but winter has persisted in the Northeast. I trudge west from Riverdale Avenue bundled into my down coat, the wind biting at my neck. I like to pretend that my mother is expecting me.

She has lived there only a few days. I am acquainting myself with the place every time I visit. She lives on one of the dementia floors, the medium-security floor, with other people who are social and display a level of intellectual competence that affords them the illusion of freedom — they do not require help from aides with dressing and bathing, for example, and they may choose where to sit at dinner.

There is a code to gain entry to the elevator, and another to make the elevator move. We are asked not to let the residents know these numbers, although this seems to miss the point; no one who lives here could retain the numbers long enough to use them. Still, I take the piece of paper on which the nurse has written the codes and stash it deep in my coat pocket, checking it discreetly before punching in the numbers.

The unit has a hospital floor plan, which casts a gloom over the space, a reminder that this is a ward, not a home. Still, the idea of a central nurses’ station affords some comfort — someone is just down the hall in case of emergency. My mother has one endless emergency here — her own urgent need to leave.

She looks up eagerly when I cross the threshold bearing my weekly gifts — this time, a CD player, some photos to hang, cookies, fresh underwear and socks. Everything she owns must be labeled; dementia-floor residents can be found in each other’s clothing routinely. “You have to have a sense of humor about it,” my mother’s social worker tells me.


Her room has a river view. I wonder what it’s like not to know which body of water it is that one sees through the glass; not to know that the sun will set over this water because one is facing west; not to know which way the bathroom is or what time it is or how to find the phone; not to remember the combination of numbers that will allow you to reach your children; to know you have children, but not to remember their names.

“It’s so large,” my mother says, as we stroll down the hall, gazing at the paintings on the walls. We take the elevator to the mezzanine, where we can get some food. We walk down another long hall, passing a small pool, a gym, a spa. “Isn’t this nice, Mom?” I ask, and she nods agreeably.

I hear music, an accordion bleating out a melody in a minor key.

Those were the days, my friend
We’d thought they’d never end
We’d sing and dance forever and a day
We’d live the life we choose
We’d thought we’d never lose
For we were young and sure to have our way.

I smile; it is comically, tragically apropos for a nursing home. Still, I guide my mother toward the music, which is both lively and disturbing, as though accompanying the final sequence in a horror film.

We arrive at a small ballroom in which a crowd of mostly wheelchair-bound seniors sits, nodding to the music, enlivened and demonstrating as much to the height of their ability. It reminds me of bar mitzvahs I attended long ago — the wall-to-wall carpeting, the tinny music reverberating in the stale, enclosed space. I turn my head toward the door and notice a bird cage. It’s actually a glass enclosure, in which parakeets and cockatoos chirp and flit from one end to the other.

“Let’s find the café, Mom,” I say, and she replies that she will follow me anywhere.

I direct her out of the ballroom. We walk until at last I see sunlight. They call it the River Cafe; it looks like a bodega. Here we can buy cookies and toiletries and coffee. Booths line a glass wall affording a dazzling view of the water.

“Look at the view!” I have been saying this a lot today, as if the sight of the river were recompense for her confinement.

“Yes, it’s lovely,” she replies. “It’s very nice.”

She is uneasy, asking me constantly if after this tour I will be taking her “out of here.”

Yes, I tell her, we are going to my sister’s house, it’s not far at all, we are walking there.

“Thank god,” she says. “I can’t tell you how happy I was to see you walk in.”

* * *

Almost no new memories imprint; I am struck by the specific details she retains: my entrance into her room, what she felt like at that instant, how desperate she felt just before. Will this crystallized moment be sent down the pipe to long-term memory? Or will she have it only for today?

We buy a cookie then leave the café. On the way to the elevator, a small room set up like a museum alcove catches my eye. Pickles and Egg Cream reads a sign overhead. It’s an exhibit of dioramas in which a woman named Ruby G. Strauss has recreated scenes from her parents’ years on the Lower East Side. I peer into a scene of passengers exiting the subway stop at Broadway and 14th Street, another of Strauss’s grandmother’s garden in summer, wedged between two tenements, a line of clothes drying above children skipping rope and a man in a straw hat reading the paper. There are dozens of little figures holding tiny props: a man drinking wine by a cathedral radio in his parlor, a bride and groom on their wedding day, a grandmother wearing wiry glasses, knitting.

Like the parakeets, the dioramas are too easy a metaphor. Life under glass. Life observed through glass. Life imprisoned within glass walls. I pull my mother out of the alcove. Her eyesight has been failing, so for her the exhibit is a blur.

I punch the code in and the elevator arrives. We emerge on the first floor and exit through the lobby, passing a collection of dolls made in the images of American First Ladies. I can see my mother’s reflection — her long coat and dark hair — in the glass that encases the dolls, moving swiftly and enthusiastically toward the lobby door.

A shock of cold wind hits as it slides open.

“It’s really a nice place,” I say. “In spring all these trees will bloom and they have barbecues in the garden.”

“Yes, I’m so lucky,” my mother says. “Are we leaving now?”

As we walk down the hill toward the guard booth, I think of an Isaac Asimov book I read in my youth. In The Caves of Steel, he envisions a futuristic city complex where New York City once stood. It is entirely enclosed, without a drop of fresh air seeping into its midst, contained under metal domes.

The air is so qualitatively different outside the walls of the pavilion in which my mother now lives, which hums with the electrical energy of a well-run hotel. Its seamless wall-to-wall carpet obliterates any hint of nature, the scent of cafeteria food permeates the first-floor corridors, the ring of elevator cars creates a perpetual dinging soundtrack in the lobby.

A siren goes off, as though a dog had jumped a security perimeter. It’s my mother’s electronic bracelet, which they’ve attached to prevent her from wandering off the property. I negotiate our departure without alerting my mother to this indignity.

We proceed eastward on a paved path. Alongside the path runs a tall metal fence that separates us from some tan, patchy grass — the sort that works as visual shorthand for the ravages of winter. I’m breathing better now, as is my mother, who has all morning complained of agitation.

“I was so glad when I saw you in the doorway,” she says again, as we walk into the wind toward Riverdale Avenue. We cross it and as we do, seem to travel through a time portal. The red-brick houses are narrow and built right next to one another; they have small porches, and I see a window sign declaring that We are all made in God’s image. There is a cozy, nostalgic compression to the neighborhood, some sense of Americana that is absent from the busy streets of Manhattan, where I live. I see errant crocuses defying the angry winter wind and a daffodil or two, flags, rusty porch swings, and broken children’s wagons on tiny front yards.

“Where are we going?” my mother asks.

“To your daughter’s house,” I reply. “We’re almost there.”

“That’s right,” my mother says. “I know I have to go back tonight, to the place, but I don’t want to think about it now.” She smiles and plays with the electronic bracelet on her arm, unaware of what it does.

I bring up the river, again, out of habit.

“There’s a beautiful view from your room, Mom,” I say, finding my own smile sinister.

“Yes — ” she starts. “I’m so lucky.”

Another universe unfolds inside my sister’s house. It is organized along the principle of family: children’s bedrooms, toys organized by age appropriateness and size, a kitchen stocked with packaged soups and treats young children like. Still, the wreckage of six children and two dogs is everywhere in evidence: chewed pillows, fights underway, dirty dishes on the table, crayons on the floor. My mother settles on the couch after asking if she can be of any help. She smiles at me, looks around for my sister, asks where the house is. I am not sure how to answer. I tell her it is down the street from where she lives, but this means almost nothing to her. We are supposed to give her something to do, something to occupy her hands and provide her with a sense of her necessity; I have read that all humans need this. Sometimes we do find tasks, but sometimes we lack creativity and tell her she should just relax. She cannot relax; there is nothing relaxing about perpetual confusion. She asks again if she can help. I ask her to pick up the crayons. I find later that she’s put them in the dishwasher.

Noam, who is 7, kisses his grandmother when it’s time for me to walk her back. He looks at her as though in love, and I wish this were all she needed, all anyone needed. I assure her that she will return to this house soon. Right now, it’s time to go. She sighs; I hear a whistle in her breath that betrays more than passing reluctance.

The clouds have drowned in the ink of a night sky; I sing a familiar tune and hold my mother’s hand as we walk back to the nursing home. I assure her I’ll be back soon. It jangles my heart, this wrongness, this dropping off, this dislocation of a family member, the exile imposed by decline. The lobby murmurs with electricity, the First Lady dolls stare expectantly from behind their glass.

I punch in the code and we ascend in silence to the second floor. My mother suddenly squeezes my hand and tells me she loves me.

The air in here is stale as ever. Both of us feel the panic return.

I have an impulse to seize my mother’s arm and run. I want to bring her home and put her in bed and sing to her until she falls asleep. Instead, I pick out a nightgown and pat her head. She tries not to cry.

“I’ll see you tomorrow, Mom,” I offer.

“Will you?” she asks.

And I retreat — from her, from her pain, from mine, from the city on the hill.

I punch in the code and the humming box takes me down. First floor. Past the First Ladies. Out into the night air. Onto the BXM2 bus, which will carry me home to Manhattan.

* * *

Reality is a series of universes with membranes loose and undulating. Where does one end and the other begin? Is there a wall between the perceptions of the “demented” and the rest of us, who retain enough memory to support a more continuous vision of our histories, our days, to support a logic dependent on past and future? Or is it more a window, or, more alarming yet, a swinging door?

All realities exist near porous borders — for example, my mind has flown into fantasy as I mentally retrace in words the cavernous, tiny universe of my mother’s nursing home complex. I’ve fashioned it into a cave of steel, in the image of another reality, stolen from within the pages of fiction. The path between it and my sister’s house is now a time machine of my own construction, my sister’s street is a world built of images I see in the past of a country in a time before I was born. My mother merely wants to know whose house we are visiting; her “sane” daughter, her guardian, is dreaming of the way in which streets, houses, concrete walkways, and riverside high-rise complexes splinter and spiderweb outward into separate communities from the moment humans began interacting with time. Which of the two of us, my mother or I, is more connected to reality as we define it as an everyday convenience? Surely my mother’s questions are more practical, more connected to pragmatic concerns, than mine, which are based on hallucinatory impressions of time and space. I have fallen somewhere on this visit, stumbled and slipped into fantasy, allowed the home that keeps my mother safe to drive me close to madness. I am not sure where anything begins or ends anymore, not sure that anything does.

The bus is turning off the Major Deegan Expressway now; it rolls steadily down Fifth Avenue. The trees on either side, illuminated by a flush of light from a streetlamp, are bare and white and wild-limbed against the black sky. They incline toward the street, forming an archway under which we sail. There’s the time portal again, coming into view: the buildings lining the avenue, stately and majestic, are alive with the ghosts of the 19th century, one can see the horses and carriages clopping underneath the arboreal canopy, one can smell the pipe smoke and the dirt, see the women in full skirts hurrying across this boulevard two centuries ago.

How easy it is to slip into reverie, to slip across the boundary from one reality to the other, one fancy to the next, especially in this city. New York is known for its boundaries between rich and poor, but also for the suddenness with which the neighborhoods change. Swank lobbies with doormen line one block, graffiti-worn bodegas and chain link fences line the next. This is how fast Madison Avenue shifts between 94th and 96th Street.

My mother is confined in her cave of steel, trapped within the boundaries of her forgetfulness. I am trapped in my own universe of half dreams and meditations. Her worries are immediate and connected to the hardness of her reality, mine are existential, free to float philosophically above her everyday concerns.

We walk the halls of her city on the hill together, unavailable to each other, trapped under different kinds of glass.

My bus drops me at my West Side stop; I exit to the sight of hot dog carts and the scent of park-bench smokers, to the music of barking dogs and basketballs bouncing. Though I am now walking a familiar route home, the sensation of wandering does not abate. Maybe it is merely the tableaux of street life shifting and sliding past that evokes my dizzy sense of dislocation. I think it is something more, though — I walk as if chased by the wind, or worse. I slow down, speed up, ascend the stairs to my apartment; still I am pursued. I go to bed and dream of the city on the hill. It is now a castle in which old people turn young, wrinkles are smoothed into satin flesh, the people dance and flick their skirts and sing.

A river runs past this castle, and boats too, in which the people make their escape. I do not see which boat my mother takes, or who ferries it, but when I return for our next visit she is not waiting on her bed for me, gazing at an unknown vista. I am so very glad — when I arrive at her threshold — that this time I do not see her there. She has crossed the perimeter between her world and my reverie, traversed the undulating boundary between reality and fancy, between mother and daughter, between dementia and freedom.

But this — this is only in my dream.

* * *

Leslie Kendall Dye is an actress and freelance writer based in New York City.

* * *

Editor: Krista Stevens
Copy editor: Jacob Gross
Illustrator: Katie Kosma

Remembrance of Folks Past: A Reading List of the Stories We Tell

Sara Benincasa is a quadruple threat: she writes, she acts, she’s funny, and she has truly exceptional hair. She also reads, a lot, and joins us to share some of her favorite stories. 

In “The Depth of Animal Grief,” Carl Safina writes, “A researcher once played a recording of an elephant who had died. The sound was coming from a speaker hidden in a thicket. The family went wild calling, looking all around. The dead elephant’s daughter called for days afterward. The researchers never again did such a thing.”

How do we remember our dead? We hold funerals. We engage in rituals that celebrate a life and symbolize its worth. We build monuments — headstones, perhaps, or statues. And we do something else, something I’ve been thinking about a lot lately. To crib a line from Lin-Manuel Miranda’s tiny little off-off-off-Broadway theatrical experiment “Hamilton”: “Who lives? Who dies? Who tells your story?”

Who lives? We do.

Who dies? They do — as shall we.

And who tells your story? The living. And while we are among the living, it is our job (if we so choose) to tell the stories of those who’ve gone. I’ve been thinking (and writing) about death and endings rather often of late. Here are some lovely examples of obituaries and tributes, some chosen by me, some chosen by helpful friends.

1. “Anthony Bourdain and the Power of Telling the Truth” (Helen Rosner, The New Yorker, June 2018)

Helen was my editor when I did this death-focused piece about TGI Fridays for Eater. She’s consistently edited James Beard Award nominees and winners, and she’s been a nominee herself. Her piece about her pal Tony is beautiful. She gives a more-than-well-deserved mention to his longtime creative collaborator, Laurie Woolever. And boy, does Rosner ever land the dismount. What. A. Kicker.

2. “Remembering Mr. Rogers, a true-life ‘helper’ when the world still needs one” (Anthony Breznican, Entertainment Weekly, May 2017)

I met Anthony Breznican — a gifted writer who regularly creates illuminating stories about entertainment and entertainers — after we spent 15 minutes chatting at a mutual friend’s barbecue, comparing his luminous Italian-American wife’s family funeral practices to those of my own clan. It was around the time his wonderful Twitter thread tribute to Fred Rogers went viral.

In college in Pittsburgh in 2001, Breznican was going through a hard time. This essay, based on the tweets, tells his story of running into Fred Rogers on campus. Here’s a snippet of what happened at what Breznican thought would be the end of a brief, polite exchange.

That’s when I blurted in a kind of rambling gush that I’d stumbled on the show again recently, at a time when I truly needed it. He listened there in the doorway. When I ran out of words, I just said, “So … thanks for that. Again.”

Mr. Rogers nodded. He looked down, and let the door close again. He undid his scarf and motioned to the window, where he sat down on the ledge.

This is what set Mr. Rogers apart. No one else would’ve done this. No one.

He said, “Do you want to tell me what was upsetting you?”

The rest is more than worth your time, neighbor.

3. “Colonel Michael Singleton” (The Telegraph, January 2003), suggested by Neil Gaiman

I ventured through the thickest wood, o’er hills and across rickety wooden spans under which dwell only the very sexiest bridge trolls (they have never heard of the internet and will eat you if you try to explain it) to climb a talking tree atop a mountain and whisper a single word into the ether: “Gaiman.”

This, as most people know, is the only way to contact Neil Gaiman. He then sent a fox riding an owl riding an elephant riding a second, extremely annoyed fox, all of them inside a hot air balloon basket, and they appeared after two days (during which time I had to urinate on the talking tree, who had some pretty colorful thoughts to share about that), and then the owl opened its mouth and dropped a piece of paper, which had the URL for this obituary on it. I borrowed the tree’s iPhone to read it and boy, did we smile!

Colonel Michael Singleton ran a boys’ prep school and was of the philosophy that young men “should be neither cosseted nor cowed,” which is as great a recipe for raising a decent human as ever I’ve heard. I’m not 100 percent on board with all the Colonel’s methods, but I admire his sense of politeness: “Knocked unconscious during action in Holland, he was saved only when a family emerged from a farmhouse cellar to drag him inside. In peacetime he returned to thank them and was delighted to be reunited with the field glasses which he had mislaid in the blast.” He was also wounded three times in battle. Later, he was appointed a Commander of the British Empire by Queen Elizabeth.

There’s a lot more, but not too much, and I think you’ll enjoy it.

4. “The most awful kind of grief. The most beautiful memories. So long, son.” (Chris Erskine, Los Angeles Times, March 2018), suggested by Carrie Seim

My friend Carrie is a journalist who has been writing for years about all sorts of things; since journalists read a lot, I figured she’d be able to suggest a powerful example of this type of writing. And she sure did. I can’t imagine writing something like this, and yet I can, just a little bit, because writers write through pain. It’s one way of coping. Sometimes it exacerbates the agony, but usually it helps — sometimes because our words end up helping someone else, who tells us so. That’s the greatest honor a writer can claim, I think.

5. “Eloquent Barbara Jordan: A Great Spirit Has Left Us” (Molly Ivins for Creators Syndicate, January 1996)

“Barbara Jordan, whose name was so often preceded by the words ‘the first black woman to . . .’ that they seemed like a permanent title, died last Wednesday in Austin. A great spirit is gone.”

Hell of a lede. But then, it’s Ivins, who specialized in ledes, kickers, and everything in between. She catalogues Jordan’s magnificent life of public service, sure, but she also gives us personal gems:

Jordan’s presence was so strikingly magisterial that only her good friends knew how much fun she could be in informal situations. Before multiple sclerosis crippled her hands, she loved to play guitar, and she loved to sing to the end of her life. Jordan singing “The St. James Infirmary Blues” was just a show-stopper.

Barbara Jordan was the first black person from the South elected to Congress since Reconstruction. But she was a lot more than her resume, and Ivins gives us a glimpse at Barbara Jordan, musician and friend.

6. “Molly Ivins, 62; humorist who targeted her wit at the powerful” (Elaine Woo, Los Angeles Times, February 2007)

I love Molly Ivins — not personally, as I’m sad to say I never met her. But when I was a teenager in the late ’90s, her work furthered my love affair with political humor, a love that began when I was a mere kid reading my grandparents’ Art Buchwald books. Here’s Elaine Woo on the final days of Molly Ivins:

In her last weeks, she devoted her waning energy to what she called “an old-fashioned newspaper campaign” against President Bush’s plan to escalate the Iraq war. “We are the people who run this country. We are the deciders,” she wrote in her last column two weeks ago. “And every single day, every single one of us needs to step outside and take some action to help stop this war.”

What would Ivins have to say today about the Trump administration’s policy of ripping families apart at the border? I have a feeling that, with some small edits, it would look much like what she wrote above.

I miss her, I miss her, I miss her.

* * *

Sara Benincasa is a stand-up comedian, actress, college speaker on mental health awareness, and the author of Real Artists Have Day Jobs, DC Trip, Great, and Agorafabulous!: Dispatches From My Bedroom. She also wrote a very silly joke book called Tim Kaine Is Your Nice Dad. Recent roles include “Corporate” on Comedy Central, “Bill Nye Saves the World” on Netflix, “The Jim Gaffigan Show” on TV Land, and the critically acclaimed short film “The Focus Group,” which she also wrote.

Editor: Michelle Weber

Nell Battle Lewis, Storyteller for Jim Crow

Getty

Elizabeth Gillespie McRae | Excerpt adapted from Mothers of Massive Resistance: White Women and the Politics of White Supremacy | February 2018 | 19 minutes (5,394 words)

In the late fall of 1923, a young Nell Battle Lewis decided to spend an evening at the Superba Theater in downtown Raleigh, North Carolina, watching Birth of a Nation for the fifth time. Reviewing the film in her Raleigh News and Observer column “Incidentally,” Lewis noted that each time D. W. Griffith’s movie came to town, she had to see it. This was her sort of “religious observance.” Birth of a Nation, she wrote, was “the best movie we’ve ever seen.” It made her weep and drove her to exclaim, “This is my native land.” She went on to claim that the first KKK was “a necessary tour de force effected by some of the leaders of a . . . civilization in danger of its very life.”

Her devotion to such a film at first seemed incongruous. Lewis had returned to her hometown after years as a southerner living outside the South. After a brief stint at Goucher College in Maryland, she attended and graduated from Smith College in Northampton, Massachusetts. At Smith, she sat in integrated classes, heard black and white political leaders, debated woman suffrage, and studied a curriculum that challenged the conservatism, reactionary impulses, and, to some extent, segregated and sectarian currents of the South. After a year in Manhattan, she had gone to France as part of the YMCA’s “Y-Girl” program to support the American Expeditionary Force. In 1921 Lewis had returned to Raleigh and interviewed with the News and Observer editors while dressed in jodhpurs, a blazer, boots, and a hat. Her androgynous presentation gave the editor pause, but he hired her anyway, as an embodiment of the “New Woman” — single, independent-minded, and career-oriented with world experience. As the newspaper’s first female staff writer, she set out to challenge the hidebound traditionalism of white southerners, pedestal-residing white women, and greedy industrialists. In economics, she rejected the trappings of the New South creed and disdained the materialism and business practices of the textile industry. In her early politics, she seemed to identify more with white women of the working class than those like her former St. Mary’s School classmates. Instead of joining the Daughters of the American Revolution and preaching Americanization and anti-immigration, she made fun of their reactionary politics and condemned their red-baiting. Opposing evangelical Christians, she parodied creationists and defended the study of evolution. When H. L. Mencken pronounced the South “the Sahara of the Bozart,” Lewis expressed her intellectual alliance with him, noting that he was “a heady stimulant . . . and effective purgative for intellectual inertia and dry-rot complacency.” As her prominence grew, southern commentators called her an iconoclast and a radical. Her enemies called her a communist; her father and brothers characterized her as abnormal, eccentric, and perhaps even mentally unstable.

Considering the widespread influence of the second Klan, her relentless attacks on it might have merited such judgments. A national organization with professional fundraisers and advertising executives, the KKK proclaimed Anglo-Saxon superiority, recruited record numbers of members, sponsored candidates for southern legislatures, and intimidated its political opponents. More than a few southern leaders lacked the moxie to publicly condemn the Klan, yet Lewis castigated the Klan for its contribution to mob justice and racial violence and told her readers that the KKK was ignorant of the very race science it claimed to follow. In her published poem, she ridiculed the Klan’s cowardice and intolerance in her opening stanza: “The Kautious Klan Klandestinely. . . . Kwarrels Konstantly with those; Who Kannot Like their Kourse Despotic.” When the Klan threatened to send one of its female members to take Lewis’s job, she gleefully wrote of her anticipation and then attacked the organization for its criticism of professional women and flappers. She deplored most of all that KKK activity put North Carolina in the company of its less progressive southern neighbors — Georgia and Alabama. Each time the KKK reared its ugly head, Lewis felt it testified to the failure of North Carolina’s white leaders who had promised a more humane, compassionate, and just state. Still, she wept through Birth of a Nation, a film that she knew the second KKK had exploited.

Taken together, these seemingly dissonant reactions were in fact not anomalous but rather typical outcomes of Lewis’s work in the cultural production of white supremacist politics. As Lewis put pen to paper, she celebrated a world led by educated white progressives, white female reformers, and black elites and populated by oppressed white industrial workers and black southerners receptive to enlightened white leadership. In the News and Observer and other periodicals, she crafted public narratives that created a cultural landscape of a more “affectionate segregation.” Her fiction and non-fiction reinforced specific historical interpretations, invoked black stereotypes, and celebrated white liberals and exceptional black men and women. Her feature writing often highlighted white women who called on social reform for white and black North Carolinians, noting white women’s gendered affinity for cleaning up politics. She praised white and black progressives and condemned those who participated in racist violence and who justified the neglect systemic to racial segregation. Lewis did not erase the black South or ignore black achievement. For example, she celebrated the poetry of Harlem Renaissance writers, congratulated North Carolina’s black collegiate choral groups, and lobbied for state-run girls’ homes for wayward black youth. She also wrote a piece that attributed the impoverished state of the black neighborhood Haiti Alley to the suspect character of those who lived there and ignored structural poverty. When she returned from her travels, she celebrated seeing the first shacks of black sharecroppers because they told her that she was home, romanticizing economic outcomes of segregation. In fact, the stories she wrote offered up both the black elite and the black folk, but such writing often served to educate white people about the appropriate “place” of blacks and whites in a Jim Crow world. 
In crafting her narratives, she encouraged her readers to follow cultural practices that reinforced racial segregation. She was a storyteller for Jim Crow.

In telling these stories, Lewis did important political work for the segregated South. Culture was one of the central levels where everyday experience could be translated into support for the larger social system, joining social welfare policies, educational practices, and electoral politics as critical sites where the Jim Crow order was shaped and sustained. Her writings offered a template for segregation to be modern and long-lasting — a system grounded in new cultural and scientific arguments more than older biological ones. For Lewis, North Carolina’s segregated order would be a product of a progressive state that adopted national reforms. Educated, liberal white supremacists, not mean reactionaries, would control race relations and mitigate the worst abuses of the system. Relying on the “best” white people, Lewis was a female counterpart to Howard Odum, who, as historian Glenda Gilmore noted, served as one of the “hydraulic engineers at Jim Crow’s watershed” urging white liberals to be the engines of gradual incremental change. With so many stories of mean-spirited and violent segregationists abusing black women and men, rarely did Lewis or Odum or progressives nationwide have to confront how their liberal reforms reified racial inequities. A broad agreement on white supremacy among white social reformers meant that Lewis could easily balance her progressive ideas with her devotion to a society of white over black. To her readers, she delivered lessons on a racial etiquette that upheld racial segregation, gendered ideas about female citizenship, paternalism, and devotion to social reform. For all the stories she told celebrating North Carolina’s enlightened race relations, she served the Jim Crow order by suppressing those that challenged the authority of liberal-minded, middle-class, educated white men and women. Lewis knew that the segregated order was never as secure as it might seem. White people needed instruction in how to maintain white supremacy. 
White apathy and white misuse of racial authority threatened the very system that guaranteed their political, economic, and cultural authority. In the 1920s and 1930s, her stories criticized the way segregation as practiced departed from the way she wanted and believed it should be. Right up to 1954, Lewis kept calling on fellow white southerners to live up to separate but equal, not abandon it.

Lewis’s brand of white supremacist politics clearly took root in the particular conditions of her home state where she could bring her beliefs in progressive era reform, modern science, eugenics, and women’s civic participation to bear on her work for racial segregation. North Carolina’s champions held the state apart from the racial violence of the Deep South, advertised its black educational institutions, embraced voices that challenged the material greed that undergirded the New South creed, and condemned the rawness and rage that characterized other southern demagogues. Politically, a relatively active state government had earned North Carolina its progressive reputation. Throughout the 1920s, rising public expenditures for state services inspired broad political discussions on economic development, social welfare, and education. Some white political and religious leaders even talked about improving black facilities, held interracial conferences, and welcomed black participation in a community of Christian humanitarianism. For the state’s leaders, North Carolina’s black population of nearly 30 percent figured in their vision of the state, where black moderates like James Shepard, president of North Carolina College for Negroes, could urge black North Carolinians to challenge inequality gradually and cautiously, exemplifying the “politics of respectability.” Josephus Daniels, once an architect of the 1898 white supremacy campaigns, owned the News and Observer, which served as a voice of moderation and modernization. The University of North Carolina at Chapel Hill recruited to its faculty such luminaries as sociologists Howard Odum and Guy Johnson and moved to national prominence under the leadership of Harry Chase and Frank Porter Graham. Progressive reformer Kate Burr Johnson headed the state’s Bureau of Social Welfare. 
In the interwar period, Bertrand Russell, Gertrude Stein, James Weldon Johnson, Langston Hughes, Frances Perkins, and Eleanor Roosevelt spoke at the University of North Carolina or Duke University, bringing some of the cosmopolitan energy Lewis had experienced in Manhattan and France.

At the News and Observer, Lewis first contributed feature pieces, edited the Society Page, and wrote a children’s page. Despairing at the limitations of these forums, she nevertheless made her first mark in “Kiddies Corner.” In this full-page feature, Lewis encouraged literacy and imagination, reinforced the social order with black dialect stories and caricatures, and promoted the study of North Carolina history. An early story entitled “Patrick, the Rollin’ Possum,” was written in dialect and included a Nell Battle Lewis original cartoon with the caption: “then the n****r held Patrick up by his long skinny tail and said: Ef dis heah’ possum ain’t sho’ nuff fat, den I dunno fat w’en I sees hit.” The next week, she encouraged young people to have their mothers read to them about their home state so they would “not only . . . feel that North Carolina is the best State, but to know why it is.”


Soon she introduced her weekly column “Incidentally,” which would run almost uninterrupted for the next forty-five years. Prophetically, her column began with a scene in a park, depicting two black men and one black woman whose “contented laughter broke forth frequently, and the red meat of the melon disappeared rapidly.” Later her caricatures acknowledged the calming comfort offered by “deferential Negroes who wave to you even when they don’t know you.” Contented black North Carolinians joined Lewis’s frequent romanticized depictions of black-white relationships embodied in her print tributes to “mammy.” She noted that the ties between mammy and her white children were “more than imaginative gossamer,” as she lamented a system based on paternalism that was “now passing with the changing times.” In return for their loyalty and love, Lewis said that mammies would receive no earthly reward but the same spiritual reward “as the white folks they worked for.” In fact, the mammy of her childhood, she claimed, “came as near being a Christian as anyone who ever lived.” For Lewis, “Mammies” embodied the epitome of black leadership — serving in a position of deference, devotion, and dependency to white middle-class women. While she attacked her state’s social ills, she had established her column by trotting out minstrel-like black characters that assured herself and others of the satisfaction of the state’s black population. Under the helpful hands of the state’s white progressives, Lewis believed, black North Carolinians would take childlike steps forward.

But as Lewis paid homage to the Mammy in print, she was participating in a larger cultural production of white supremacy in which the iconic black domestic took center stage. In the immediate aftermath of the 1922 dedication of the Lincoln Memorial, the Washington, DC, branch of the United Daughters of the Confederacy (UDC) gained congressional support for a granite tribute to black mammies. Mississippi’s Senator John Sharp Williams proposed and received appropriations of $200,000 for it, and North Carolina’s Charles Stedman introduced the funding bill to the House of Representatives. At the peak of its membership, the UDC seemed poised to build a monument that imposed its historical interpretation on the national cultural landscape. Some black newspapers responded with outrage. Newspaper owner, editor, and art historian Freeman Henry Morris Murray argued that “public sculpture was not merely reflective . . . but also productive of new publics and power relationships.” Encouraging his readers to be more critical in interpreting the meaning of sculptures, he asked them to evaluate “its obvious and also . . . its insidious teachings.” Black newspapers published their own renditions of a mammy statue that spoke to sexual aggression and assault coupled with long hours and no wages. For the UDC, the Mammy monument offered a racialized household that put white women in positions of authority, allowing them “to recast their own citizenship” and create a more “affectionate segregation.” While the monument never materialized, “mammy” did not need to be cast in bronze to function as an important symbol of segregation. Inked in Lewis’s columns, she remained both important and politically flexible in propagating the cultural infrastructure of segregation.

Lewis did not just deliver black characters of white mythology in her storytelling but also offered up black literary luminaries and black educational leaders. Lewis had long noted that she read the NAACP paper, The Crisis, and celebrated the artistic achievement of “Negro poets” like Claude McKay and James Weldon Johnson. Her favorite Harlem Renaissance novelist was Jessie Fauset, whose upper-class African American characters condemned passing as white and interracial marriage, themes that would have fit well with Lewis’s belief in eugenics and white supremacy. Lewis’s book reviews also upheld a racial hierarchy. In 1924, Lewis wrote a joint review of Walter White’s A Fire in the Flint and E. M. Forster’s A Passage to India, declaring that Forster’s work was art and superior in form and tone to White’s A Fire, “a more melodramatic piece along the lines of propaganda.” With omissions and exaggerations, White’s book, she claimed, made for a biased treatment of the “Southern White” and the “Southern Negro.” She saw similarities between the two works: in each, the ruling people did not understand the colonized — blacks or Indians. She also saw parallels in that the rulers were ruling for “their own good,” not the common good. What bothered her most, however, was that “the Negro mind,” which she assumed to be distinct, appeared in White’s book as “not one whit different from that of the white man.” White’s black man acted just like a white one would under similar circumstances. “Can the Negro author who speaks for his race in this novel give us something more distinctive than that? . . . With all the mystery of Africa and all the darkness of slavery behind him, is there nothing unique in the Negro, after all?” she asked.

Lewis’s question exposed the cultural and geographic underpinnings of her racial ideology. Proud of her association with social reform, informed by scientific data, and assured of white women’s authority because of their particular racial and gendered identity, Nell Lewis rejected the pedestal and the pulpit but believed in Anglo-Saxon superiority. She rooted her hierarchical beliefs in “race science,” a position superior to those southerners whose racism rose from raw emotion. To educate her readers, she ran a crossword puzzle about eugenics, celebrating modern scientific thought. But as her review of White suggested, her racial liberalism left no space for discussions of an equality born of commonalities. Modernism had educated her, and there were differences — biological, cultural, historical differences — she believed, that should shape public policy and culture. It was not anti-modernism or economic gain that drove her racial politics, but a Progressive Era devotion to social reform, women’s gendered contributions to society, and modernity itself.

While Lewis’s attention to black accomplishments reflected a kind of racial moderation to both her white readers and her black readers, it simultaneously stung some black readers. In the winter of 1925, she attended a production of Shakespeare’s Twelfth Night put on by the Shaw University Players. Despite the technical perfection, Lewis noted that “the general effect of the performance was strikingly artificial.” Instead of Shakespeare, which black students must perform, she claimed, in their “adopted language,” she advised them to focus on folk drama. While the KKK had carried “racial consciousness and racial pride . . . to excess,” she conceded, “I am a great believer in trying to be what you are.” Lewis advocated an emphasis on “their own distinct racial character.” Lamenting that the “advancement of the Negro has been largely imitative,” she was anxious to witness “a genuine drama of their own.”

Willing to engage with her critics, Lewis published the objections of two black North Carolinians who lamented how white supremacist ideology infiltrated her public narratives. Shaw University dean William Turner appreciated her “to some degree complimentary criticism” but disagreed with her assessment of English as an adopted language for African Americans. He instructed Lewis that black and white babies learn language in the same way and that there was no “racial predilection for any particular language.” Black social heritage in the United States, he continued, was the English language. At the State Department of Public Instruction, W. A. Robinson also noted that her comments solicited much discussion among those who “admire your usually broad attitude toward thought in general and concerning the Negro in particular.” He also disagreed with her suggestion that black Americans just imitated white Americans, noting that black Americans had long legacies of their own American traditions.

Two years later, Lewis again sparred with her critics after she reviewed black musical performances at the governor’s mansion. When black performers sang “Negro-folk songs,” Lewis praised them because they “sang like Negroes.” In the middle of “Cotton need a-pickin so bad,” the Fayetteville singers even “did a little shuffle . . . exactly right,” she wrote. This time a University of North Carolina professor reminded her that the “cultured Negro . . . is not the freedman of 1867.” Eavesdropping on a conversation about her review among black college girls, he heard them comment that “the white audience had a taste for music that was satisfied in direct proportion as the program descended toward more clownish setting.” For Lewis, the Jim Crow South meant black southerners occupied a particular cultural place, and this meant deference, dialect, and slave spirituals, not Shakespeare, “correct” English, or political participation. Her reviews and accompanying criticism reminded her readers — both black and white — that white supremacy reigned even among white southern liberals.

Lewis’s views on social reform, however, held some real possibility for positive changes to the justice and prison systems. She worked with Howard Odum and the Journal of Social Forces to publicize reform proposals for mental health and penal facilities. This work connected her to nationwide efforts that rooted reform in social science research and simultaneously reified an American racial hierarchy. Condemning capital punishment for those suffering mental disabilities, Lewis wrote about “a lone man behind the grim gray walls of the State’s prison, with a pitifully jangled brain [who] will pass swiftly and mercilessly and forever into death’s dark silence.” In 1925, she told her readers how prison guards murdered a “mentally defective Negro prisoner.” Lewis blamed this state-sanctioned killing on politicians who cared more for the bottom line than prisoner well-being, an impulse that also shaped an unwillingness to fund a segregated institution for the “feebleminded.” Thirsty for revenge, state officials would rather have a rape trial and lynching of a black man “with a mind of a 10 year old,” Lewis wrote, than “provide adequately for the mentally ill.” Lewis was incensed that “mental defectives” — particularly those who were black — were often left in society to commit crimes and then put to death without ever receiving treatment. Lewis argued that without the “exercise of disinterested public spirit and intelligence” that might consult sociological rather than economic studies in the pursuit of a fair and just legal and penal system, the state’s political leaders would fail to uphold North Carolina’s progressive image. As a result, Lewis feared that North Carolina would never rise above the South’s reputation of “savagery” and “backwardness.”

Her outrage about capital cases of mentally ill prisoners in 1921 and 1925 coalesced in her study entitled “Capital Punishment in North Carolina.” Full of data about age, region, race, economic standing, and crimes of those put to death by the state, her research connected her to the American League for the Abolition of Capital Punishment (ALACP) and the work of its secretary Vivian Pierce and lawyer Clarence Darrow. Pierce praised Lewis’s report on capital punishment as unmatched and asked her for permission to publish parts of the report. While Lewis worked with the League and other reform organizations, she did not join the ALACP, the southern-based Commission on Interracial Cooperation, or the Association of Southern Women for the Prevention of Lynching (ASWPL). In 1930, when a black man was lynched for the alleged rape of a white girl in Edgecombe County, Lewis did not sign the petition circulated by the North Carolina ASWPL. She did write a blistering article that blamed South Carolina’s former senator Coleman Blease, known for inciting racist violence among the white working class, for the particular brand of vitriolic racism now circulating in her home state. She criticized the barbarity of a mob that took no account of either the evidence or the mental condition of the accused. Lewis worked closely with white female reformers, public health officials, and the League of Women Voters to upgrade mental health facilities, youth reformatories, and prisons, and to make the state’s judicial system administer justice that met the spirit of separate but equal. From this liberal political platform, Lewis managed to continue to craft North Carolina’s position as a progressive southern state even in its commitment to racial segregation.

* * *

In 1931, the editors of the Chapel Hill magazine, Contempo, Lewis’s friend Paul Green, and social scientist Guy Johnson invited Langston Hughes to the University of North Carolina for a reading of his scathing poem “Christ in Alabama,” about the false accusations and shoddy trial of the nine Scottsboro boys. Hughes came to town, read poetry, and charmed many Chapel Hill residents, simultaneously earning the ire of industrial and political leaders across the state. While Nell Lewis applauded academic freedom, her brother, Kemp Lewis, led a campaign to punish those who sponsored Hughes. He wrote to UNC president Frank Porter Graham claiming that Hughes’s poetry, particularly the poem he referred to as “Black Christ,” was “enough to make the blood of every Southerner boil to have a man like this . . . given any attention or consideration whatever by decent white people.” Kemp Lewis asked “if this Negro was allowed to use the buildings” or if he had “any recognition whatever by the faculty?” He then questioned Graham about the students who authored Contempo and accused them of “striking at the very foundations of our civilization and our social relationships.” Not satisfied with alerting only Graham, Kemp Lewis proceeded to notify Governor O. Max Gardner and included clippings of Hughes’s poetry in his letter. He then asked the governor to speak to Graham about this attack on white supremacy.

The turmoil over Hughes alerted the state’s white elite to “subversive” activity at their university. By early 1932, more than 300 people had signed the Tatum Petition that called on Graham to curb “the alleged evil influences of the University of North Carolina upon the youth of the State.” Though convalescing from oral surgery and bouts with mental illness at Tucker’s Sanatorium in Richmond, Nell Lewis did not let this attack on academic freedom pass silently. She wrote her brother Kemp that she hoped “all is well at the University” and asked “Is ‘Contempo’ still uncensored?” “I wish you would run David Clark out of that State,” she continued, as he was “behind that petition . . . as sure as the world, and is nothing but a public nuisance.” Kemp Lewis did not sign the Tatum Petition, but he continued his protest and broadened his attack to include the university’s leniency on socialism. In her weekly column, Lewis ridiculed the Tatum Petition, describing it as “foolishness, just plain foolishness — I don’t care how many mayors, ministers, and manufacturers have signed it.” She defended the presence of both Russell and Hughes and claimed sarcastically that “although that [the Hughes visit] was in the ticklish realm of race relations in the South, lynching still seems to me out of order.” While Kemp continually referred to the “nausea that came to me over the Langston Hughes incident,” Nell Lewis wrote, “Black or white . . . Hughes is a poet and like it or not, his works are part of current American literature.”

When Lewis returned to health and to North Carolina, she became less vitriolic in her calls for reform and more indebted financially to the very brothers she had excoriated. The cultural landscape of white supremacy that she continued to shape from her columns, however, was not decidedly different than before, even with the New Deal. She still condemned racist violence and an unresponsive judicial system, and she upheld what she believed could be a sanctified and responsible system of white over black. Far from challenging this position, architects and leaders of the New Deal helped her cultivate this space for social reform in the hands of an enlightened white elite. Thus, Lewis’s friend Frank Porter Graham could belong to the Southern Conference for Human Welfare and deny Pauli Murray, an NAACP member and civil rights activist, admission to University of North Carolina’s graduate program in social work. Even as African Americans realized the subversive potential of the New Deal, liberal white supremacists, like Lewis, saw few national challenges to southern race relations from the federal government, the Democratic Party, or black southerners.

She still worked to expose her state’s failures to meet the equal part of separate-but-equal and attacked reactionaries who condoned exploitative and cruel public policies. Lewis’s commitment to prison reform and her public commentary on the deplorable conditions faced by the state’s black and white incarcerated demonstrated that she still had room to critique the implementation of white supremacy without threatening its foundation. She exploded with characteristic fury and sarcasm when two black prisoners, Woodrow Wilson Shropshire and Robert Barnes, lost their feet to gangrene. Sentenced to “serve short terms” on the state highways for larceny and drunk and disorderly conduct, respectively, Shropshire and Barnes suffered frostbite after being “hung up” in marginally heated cells during twenty-degree nights. After nine days of such treatment, they worked eighteen days in the prison camp until they received medical treatment for “the flesh of their gangrenous feet rotting and dropping off the bones.” At ages nineteen and twenty, the two black men had their feet amputated and were left crippled. When the case reached the courts, the unfairness of the judicial system compounded the tragedy, reinforcing how Jim Crow courts equaled injustice. The jurors failed to find the guards and the prison physician guilty of cruel and unusual punishment. Lewis claimed that this case revealed how African Americans were often denied the right to ask for justice in the state’s courts. Lewis noted that the state-appointed attorney presented a lackluster case for the prosecution. Even though an indictment could not help the prisoners, she noted that it could have shown them that justice was available to African Americans in North Carolina. Instead, she claimed, the trial “actively says to them — and to an admiring world . . . Just a couple o’ n****rs — so we should worry.” Taking an even sterner stand, Lewis proclaimed that black North Carolinians had not “a ghost of a chance in its [the state’s] white man’s courts . . . because they were poor Negroes without influence.”

Read as a defense of black civil rights, Lewis’s condemnation of prison abuse would earn her a place among some of the most liberal activists of the 1930s. The all-white court system — a product of segregation — was partially to blame, contended Lewis. This was a bold assertion in 1935; it was not a damning one. For Lewis, whites failed to uphold a legal system that guaranteed their superiority, not their infallibility. Segregation laws did not prohibit a just conviction of white criminals. The white prison guards and physicians deserved jail time for their crimes and for compromising the myth of white superiority. Whites had failed to uphold the law and in doing so had threatened the entire rationale of white supremacy. In failing to carry out its legal responsibility, the courts of North Carolina, not Lewis’s critique, jeopardized the system of racial segregation. In fact, she was all too aware that incidents such as these earned her beloved state the condemnation and condescension of outsiders and perhaps threatened to incite the spirits of the state’s black citizens.

Her blistering attacks fell short of condemning racial segregation. Neither did she support the Southern Committee for People’s Rights, a Chapel Hill group led by her friend Paul Green and other white radicals who called for the dismantling of racial segregation. Lewis’s commitment to social reform apparently did not push her this far. The committee rebuked the system and also defended the rights of the prisoners as individuals. In advance of a national discussion, they spoke of human rights and tied their efforts to those working for African American civil rights. Lewis did not adopt the human rights discourse but maintained a tone of parental remorse and paternalistic regret when she affirmed that even in the face of injustice, “it seems to me that the Negroes of this State, as a whole, are remarkably well-behaved, remarkably patient.” In her open statement to North Carolina’s black population, she reassured them that “many other white people in North Carolina are shamed by this verdict . . . [and] we consider it a disgrace to the State.” She admitted, however, that her “many” was really more like a few.

* * *

While many North Carolinians and students of the 1920s would remember Lewis’s radicalism, advocacy for industrial reform, and opposition to the region’s most reactionary moments, her most long-lasting work had been in the cultural production of white supremacy. Carefully balancing her political radicalism in other areas with a relatively liberal position on segregation, Lewis had emerged as an incisive storyteller for segregation and the political project that undergirded it. Her reputation as a “truth-teller” only reinforced the lessons she offered about white over black in the Jim Crow South. Her racial politics also offered educated, progressive white southerners a politically palatable way to digest the politics of white supremacy. Lewis was not out of step with more progressive views of women’s political activism. Her efforts connected her to reform projects across the nation — prison reform and social science-based policies hatched in universities and published in academic journals. Rooted in this modern political context, she offered white southerners stories that carried the white supremacist political project forward.

* * *

Elizabeth Gillespie McRae is an associate professor of history and director of graduate social science education programs at Western Carolina University.

Editor: Dana Snitzky

Etta or Bessie or Dora or Rose

AP Photo / CSA-Printstock, Photo illustration by Katie Kosma

Elisa Albert | How This Night Is Different | May 2018 | 23 minutes (5,706 words)

October 2004

Dear Philip,

You must be aware of the intimidation factor inherent in anyone’s writing to you, but I wonder if maybe the paradigm is similar to what happens when a stunning woman walks into a room: no one approaches her, she’s simply too beautiful; everyone assumes they have no shot. Maybe you don’t get many letters. Maybe you haven’t received a truly balls-out, bare-assed communiqué since 1959.

You once signed a book for me. That’s the extent of our connection thus far, but it’s something, isn’t it? The book was The Counterlife, but I had yet to read it when I presented it to you for signature. You were unsure of the spelling of my name, and so there’s an endearing awkwardness, a lack of flow, to the inscription. For E, you wrote, and the pen held still too long on the page, leaving a mark at the point of the lowest horizontal’s completion while you waited for me to continue spelling. L, you continued on, and then, again, a spot of bleeding, hesitant ink before the i and the s and the a, which proceed as they should before your slanted, rote, wonderful autograph. I remember being all too aware of the impatient line behind me, people clutching their copies of Portnoy’s Complaint, Goodbye, Columbus, The Human Stain, the odd Zuckerman Unbound. I tried to meet your eye, I tried to communicate something meaningful. The others, of course, didn’t get it. I wanted you to know: I got it. Later, when I found my way to reading the book, I actually purchased a whole new copy so I wouldn’t sully my signed paperback. I cherish our moment of eye contact, your pen hovering over the title page, my name circulating in that colossal mind of yours.

But wait. This is no mere fan letter; no mere exercise in soft-core intellectual erotica constructed for your amusement. I have an objective. How old are you now, Philip? Early seventies, is it? You are, of course, notoriously private. I have the books, sure, like everyone else. And the reviews of the books, each of which mentions the notorious privacy. And there’s the Claire Bloom debacle, which I hesitate even to mention, given its complete disrespect of the notorious privacy (though you might be happy to know that I couldn’t find “Leaving a Doll’s House” in any of the four sizable bookstores I checked and had to finally order it on Amazon). And The Facts, which I made a point of reading after the Claire Bloom, for balance. A graduate school friend of mine was your research assistant for a few years while we pursued our MFAs, and it took her almost a year of post-workshop drinking to slyly confess, to a rapt audience of salivating young writers, her association with you. (Otherwise you’ll be happy to know she was loyal; she professed total ignorance of your life, your private matters, even your address. She seemed, in retrospect, somewhat terrified of you. I half-seriously offered her boyfriend a blow job if he’d get me your address. The table of young writers giggled madly and took big sips of beer.)


When Will Hip-Hop Have Its #MeToo Reckoning?

Kelis performs in Paris, 2014. (David Wolff-Patrick/Redferns via Getty Images)

In a recent interview with the celebrity news site Hollywood Unlocked, singer Kelis discussed her seven-year relationship with her ex-husband Nas, the legendary Queens rapper, with a level of detail she never had publicly. She described a mix of “intense highs and really intense lows,” including bruises from physical fights, alcoholic binges, cheating, and emotional abuse. Kelis also claimed that, since the divorce in 2010, Nas had been a difficult and unreliable co-parent to their 8-year-old son. At more than an hour long, the interview is a marvel of a testimony and rings with emotional honesty. Kelis seemed weary of keeping quiet about her past, saying she simply woke up and thought “not today.”

‘Like Floating Through a Library’: An Interview with Nick Paumgarten

Big Bend National Park (Education Images/UIG via Getty Images)

For a recent issue of The New Yorker, staff writer Nick Paumgarten floated the rugged canyons of the Rio Grande to witness the irreplaceable wilderness that Trump’s proposed border wall would destroy. A native New Yorker, Paumgarten fell in love with whitewater on Idaho’s Salmon River as a kid. Paumgarten’s feature, “Water and the Wall,” takes readers through the riparian heart of Big Bend National Park, in a flotilla that includes Teddy Roosevelt’s great-grandson and New Mexico Senator Tom Udall. Damming, diversion, pollution, and overpumping have long degraded America’s rivers, reducing clean rushing waterways to canals with as much wildness as a pet store. Paumgarten’s story shows how heightened border enforcement poses a new environmental threat.

You mention you hadn’t given much thought to the Rio Grande before you began your reporting. Did that knowledge vacuum hinder or help as you started examining the river?

There’s something slightly Trumpian to the presumption that one’s own ignorance of a subject extends to the rest of the world. In this case, the knowledge vacuum lured me in, got me curious, and made the thing seem worth doing.

It’s always great to have people who know their way around a subject or a place and can fill you in. I was fortunate here to be on a trip with a handful of such people. It was because of them that I went on the trip, really. They had done the work so I wouldn’t have to. It was like floating through a library. I just had to pay attention and jot it all down in my waterproof notepad. (I learned pretty quick that it’s hard to take notes and steer a canoe at the same time.) On the other hand, I knew a little bit about rivers in general. I’d been on a bunch of float trips, paddled kayaks here and there, and had passed hours upon hours talking about rivers with other boaters. I’d read and loved Cadillac Desert and Desert Solitaire. So I brought something to this one. I usually like to have some point of contact, some toehold, when I set out to report a piece.

This boat trip let you return to the whitewater kayaking you did in your youth, and to make good on a promise you made to yourself about taking a rafting trip later in life. This was a small personal thread in your article, but a powerful one. What was your logic for including a bit of the story of your life as a river runner?

No logic. Pure narcissism. Well, okay, maybe there’s a reason or two. As I said before, I like to have some kind of connection to a story. Sometimes that connection is personal. I read somewhere recently that John McPhee once tallied up all his stories and discovered that almost all of them had something to do with subjects he’d been interested in before he even went to college. This story was a mix of things, and one of them was that it’s an ode to river-running.

A quiet theme here is that the impetus to protect rivers usually arises out of spending time on them. This seems true in a broader sense. (The demise of, say, the Great Barrier Reef is more painful to contemplate if you’ve been there to see it.) Many of the people in this story got religion on a river, and so maybe it made sense for me to describe how I had, too. Likewise, you can’t quite appreciate how absurd the idea of a wall is until you’ve spent some time in some of the places where one might go. Donald Trump and his cabinet ought to float the Rio Grande.

People need the chance to contemplate their existence in what you call nature’s “prehistoric hush,” to experience the cosmic out by a campfire. And yet, new sections of that absurd wall are being considered that would destroy that hush. How do you see your role as a journalist in helping stop these things?

I don’t really ever think of myself as an advocate for a point of view when I’m reporting and writing pieces. In this particular instance, I think the wall’s a lousy idea. I also think rivers deserve as much protection as we can muster. But I didn’t take the assignment in order to advance those arguments.

Maybe I wound up doing it subconsciously, but my role, as I see it, is to bring things to light, and to present them in a way that makes you see those things in a new and different way. To the extent that there’s guile in the structure or in the emphasis, it may have more to do with keeping the reader interested, or maybe creating moments of insight and delight.

Did you read any classic river books before starting this trip? John Graves’ Goodbye to a River or Mary Morris’ The River Queen: A Memoir?

When I got out of college, I thought I’d be doing what we used to call “nature writing.” I’d been reading a lot of Edward Abbey, Aldo Leopold, Rachel Carson, John McPhee, Peter Matthiessen, Gretel Ehrlich, Barry Lopez, Norman Maclean — all that stuff, which either was a thing in the early 1990s, or was a thing in the intermountain west, where I’d gone to live for a time.

Twenty-plus years back in New York City had beaten some of that out of me, or at least had caused me to forget that that’s what I was into. I have not read the two books you mention, though the Graves came up on the Rio. Add it to the list! To be honest, on this one, in the time allotted, I could barely take a big enough bite out of Paul Horgan’s history of the Rio Grande. I also had in mind my colleague Ben McGrath’s forthcoming book about Dick Conant, an itinerant vagabond canoeist and latter-day Huck Finn, which I’d seen some early chapters of. It captures a workaday riparian America that I hardly knew existed.

As a New York City native, what had urban life beaten out of you that the nature writing revived?

Returning to New York as an adult really just diverted me from thinking, writing, or reading about the outdoors, the American West, and the natural world. It was hard to get out.

I got a job at a weekly newspaper in Manhattan, the New York Observer. The focus was on people, the machinations and ploys of city dwellers. Culture, politics, business. The whole circus. Editors and readers generally didn’t seem to care much about timber rights or water flows or endangered species, or nights out under the stars. I got re-urbanized. I grew cynical about a certain kind of writing — overly poetic evocations of natural beauty, pat epiphanies out in the bush.

Meanwhile, as you get older, maybe you get more interested in questions of money and class, in the way generations rise and fall, who’s screwing over whom and how. But in the last couple of years, I’ve been on a few assignments and trips that have reminded me about what excited me when I was younger, and I’m sort of trying to figure out a way to get back to it. This Rio Grande trip was one of these.

Can you reconcile your interest in the West with your current location? How about a Talk of the Town department for a town on a trout stream?

Whenever I do the where-from-here math, I find that I still love this town, and for that matter the whole tidewater east. But who knows what tomorrow will bring. One thing it won’t bring me is new shoulders, so big-water kayaking ain’t in the cards.