Katy Kelleher | Longreads | June 2020 | 19 minutes (4,853 words)
In The Ugly History of Beautiful Things, Katy Kelleher lays bare the dark underbellies of the objects and substances we adorn ourselves with.
* * *
He wasn’t even two years old; a tiny thing, really, hardly even a person. Alfred was the ninth son of King George III and Charlotte of Mecklenburg-Strelitz, their fourteenth child. But his numerous siblings didn’t make Alfred any less beloved. Portraits of the boy show him as rosy-cheeked and handsome, with light eyes, a pronounced Cupid’s bow, and soft folds of neck fat. His royal parents loved him dearly, and when he died on the 20th of August, 1782, Queen Charlotte was said to have “cried vastly.” The king, too, was bereft. Later, when he went mad, he reportedly held conversations with his lost little boy and his brother, Octavius, who’d also died as a child.
Often, upon losing a family member, 18th century mourners would send the dead to their graves only after giving them one last haircut. They would harvest the locks to create elaborate weavings. Sometimes, the hair would be fashioned into floral wreaths. Sometimes, it would be made into jewelry. Frequently, the hair was plaited and pressed into lockets, which were then worn close to the heart. Prince Alfred didn’t have enough hair on his small blond head for a weaving, but a tress did make it into a locket — a single soft curl. It sits behind glass, in a gold and enamel frame that displays the dates of his birth and death. The other side of the locket, a delicate piece of jewelry shaped like an urn, is decorated with seed pearls and amethysts. It is now part of the Royal Collection Trust. “Due to his age, there was no official mourning period for Alfred,” notes scholar and collector Hayden Peters at The Art of Mourning. “But his death came at a time of the mourning industry being a necessary part of fashion and a self-sustaining one in its own right.”
When it comes to mourning jewelry, there’s no piece quite like the locket. Whether urn-, round-, oval-, heart-, or coffin-shaped, it’s an item that telegraphs absence. I love is the message the locket sends. Or perhaps more accurately, I have loved. Even today, we understand that lockets are meant to show allegiance to someone who is not present, whether the loss is through death or just the general isolation of modern life. A grandmother might wear a locket with pictures of her far-away grandchildren. One half of a long-distance couple might keep a locket with a bit of their partner’s hair. I know a woman who wears a locket with a picture of her dead sister; she plays with it sometimes when she’s drifting in thought.
It’s a beautiful piece, but it’s impossible for me to divorce the beauty of the silver pendant from its significance. Once you know someone’s greatest wound, it’s hard to look at them the same way you did before. And once you know an object’s terrible provenance, it’s difficult to covet it without feeling at least a little guilty, a little angry at your own sinful schadenfreude.
Before the ritualization of mourning in the Victorian era, wearable containers were a discreet way to keep an item close, usually something that had significant personal meaning or an intimate purpose. These pendants, brooches, or rings were visible and sometimes highly ornate, but their contents weren’t typically meant for public consumption. As emotions have slowly become more public (and more performative), so too have lockets gone from being highly private objects to functioning as a means of displaying big sentiment in a socially acceptable way. Like generational trauma tap dancing through DNA strands, jewelry transports sentiment from one person to the next. It holds, in its tiny little chains and clasps, evidence of our most devastating emotions, from fear to grief to existential despair. It makes those things small, palatable, pretty. But in the shrinking of emotion, we run the risk of losing touch with the expansive and all-consuming reality of grief. We risk losing the opportunity to come together as a community, to hold not jewelry, but each other.
* * *
For as long as we’ve been aware of our bodies, we’ve adorned them. Adam and Eve donned fig leaves to cover their nakedness, and thus clothing was born. But we just as easily could have covered ourselves with other objects, for other reasons. It’s possible we wore furs to stay warm. It’s also possible we wore them to look cool. (We’ve come a long way, sartorially, from the hides-and-leaves days.)
If this conflates clothing and jewelry, it’s because the line between the two is actually quite thin. Clothing is typically made of fabric, leather, or fur, while jewelry is made of metal. Yet some jewelry is made of leather and fabric, and some clothing is made from iron and gold, so the difference isn’t about materials. It’s about function: Clothing covers and protects the body, jewelry adorns and enhances it. “Jewelry has been a constantly evolving product of its time for centuries, and looking at the styles of a particular age is a great way to discover where people’s heads were,” says jewelry historian Monica McLaughlin. “Over time, jewelry has served as a form of talisman or a personal item of reflection, as a way to support one’s country in a war effort, or as an outlet for people — rich or poor — to memorialize their loved ones or proclaim their latest enthusiasms. It really is a tiny, exquisite little window into history.”
The word locket, most likely derived from the Frankish word loc or the Norse lok, meaning “lock” or “bolt,” first appeared in the 17th century, but the concept of a diminutive, wearable container dates back much further. The earliest examples of container jewelry — a category that includes lockets, rings, bracelets, brooches, and even chatelaines, a kind of metal belt that allowed the wearer to carry keys, scissors, good luck charms, and a variety of small containers attached to one central decorative piece — come from the Middle East and India, though it’s proven difficult to tell exactly when or where the locket was born. Until recently, jewelry wasn’t as rigorously studied as other art forms, says Emily Stoehrer, jewelry curator for the Museum of Fine Arts in Boston. “Maybe it’s the materials,” she muses. Or maybe it has something to do with the newly gendered nature of jewelry (diamonds weren’t always a girl’s best friend, if you get my drift).
The Museum of Fine Arts has built up a substantial jewelry collection over the past century. One of the MFA’s most popular and most written-about items is the Hathor-headed crystal pendant, a piece that has been dated to 743-712 B.C.E. It’s also the earliest example of container jewelry that I’ve found, though I strongly doubt that it was the first of its kind. Just over two inches tall and an inch-and-a-quarter wide, it consists of a hollow crystal ball topped with a tiny gold sculpture of a serene, long-haired Hathor. The goddess wears a headdress featuring a pair of cow horns and a sun disc. The woman’s face looks composed, kind, and brave — fitting, since she’s the deity of beautification, fertility, and a protector of women. Hathor, according to Geraldine Pinch, author of Egyptian Mythology, was “the golden goddess who helped women to give birth, the dead to be reborn, and the cosmos to be renewed.” Later, during the Greco-Roman period, she became known as a moon deity, and the goddess of “all precious metals, gemstones, and materials that shared the radiant qualities of celestial bodies.”
This pendant was found in the tomb of a queen who lived in Nubia. We don’t know what the crystal originally contained; the MFA website says it “probably contained substances believed to be magical.” Stoehrer doesn’t have much more to add, saying that it is “believed to have had a papyrus scroll inside it with magical writing that would have protected the wearer.” The mystery, she says, is part of the appeal. “People love the story of what might have been in it, what it might have said.”
According to Stoehrer, wearable prayers and early receptacle jewelry were created around the globe, but were particularly popular in “non-western” countries; historians have found evidence that people in ancient India and Tibet carried magical wardings on their bodies, pieces of prayers and words for good luck. Christians eventually began to wear small containers holding devotional objects a bit later, sometime in the Middle Ages. But some devoted followers of Christ weren’t satisfied with writing down a few words of worship and calling it a day. Instead, they hoarded pieces of people, bits of bone and hair and blood.
Relics are one of the grisliest forms of Christian worship. Although the belief in relics, defined by the Metropolitan Museum of Art as the “physical remains of a holy site or holy person, or objects with which they had contact,” has been part of the religion since its beginning, the trade in relics truly began to pick up steam during the reign of Charlemagne. According to historian Trevor Rowley, the body of a saint could act as a stairway to heaven, providing a “spiritual link between life and death, between man and God.” Relics were typically stored in decorative cases called reliquaries. Made from ivory, metal, gemstones, and gold, reliquaries had places of honor in churches, monasteries, cathedrals, and castles. The most revered relics were objects that Jesus or Mary had touched or worn (including purported pieces of the True Cross, his Crown of Thorns, or scraps of woven camel-hair believed to have been worn by Mary as a belt) but there are plenty of relics that belonged to lesser figures, like saints. Many of these aren’t lifeless objects like shoes or hats, but bits of hands and arms and hearts and legs. (There are also secular relics, like three of Galileo’s fingers, on display at the Galileo Museum in Florence, or the alleged 13-inch-long pickled penis of Rasputin housed at the Museum of Erotica in St. Petersburg, though these objects aren’t worshiped in quite the same way.) Since there are thousands of recognized saints in Christianity and it’s hard to tell one disembodied leg or desiccated kidney from another, there are a lot of possible relics out there to be unearthed, sold, and displayed.
Fascinating as these grim objects may be, they’re still less strange than the reliquaries once worn by medieval Christians. It’s one thing to inter a body in a church and allow visitors to pray over it on a Sunday, and quite another to take a fragment of finger bone, stick it in a tiny silver case, and wear it around your neck, but that’s exactly what people did. One personal reliquary housed at the British Museum, dated to 1340, is made from gold, amethyst, rock crystal, and enamel. Inside the colorful locket nestles a single long thorn believed to come from the holy crown. Many reliquaries held splinters of bone, though later analysis often found that the bone was unlikely to be from a saint (and sometimes wasn’t even from a human). Merchants sold reliquary pendants stuffed with teeth, hair, blood-stained fragments of cloth, drips of tomb oil, and other supposedly holy items. The practice continues to this day, but Peak Relic was during the Romanesque period, which ended around 1200 C.E.
As the Middle Ages gave way to the Renaissance, container jewelry was used more and more often for mundane (and hygienic) purposes. There are many examples of people keeping scented materials in little wearable containers in attempts to mask their natural smells. Known as pomanders, from the French pomme d’ambre (apple of ambergris), these perfume balls were packed with musk oil, ambergris, and other less costly plant-based fragrances. The Metropolitan Museum of Art has ten in their permanent collection, including an incense ball from 13th or 14th century Syria and a skull-shaped pomander from 17th century England. There are intricate silver many-chambered balls and basket-shaped pendants that would have once housed fragrances like neroli, civet musk, ambergris, rose oil, and myrrh, a shell-shaped gold pendant that still has “traces of a red residue” inside its chambers, and even a pomander bead that was part of a devotional necklace or rosary and contained pictures of three female saints hidden behind spring mechanisms.
If you didn’t want to carry around perfume, you could pack your pomander with an opium-laced mixture known as “Venice Treacle” in late medieval and early Renaissance England. (Opium was believed to be effective against the plague, so its usage was medicinal as well as recreational.) If you were really ambitious, maybe you’d wear a poison ring. It would be an easy way to defeat political rivals: Pour them a goblet of wine, flick the locking mechanism, and let the poison drop from your hand into their cup. Voilà, no more pesky Venetian cardinal or aggressive Flemish countess. According to legend, multiple members of the infamous Borgia family wore poisoned rings filled with cantarella, a custom concoction made by 16th century Italian merchants from either the juices of rotting pig entrails sprinkled with arsenic or the froth that accumulates on a poisoned pig’s mouth after it dies from arsenic poisoning — fables differ in the details.
Pomanders and poison rings weren’t truly that far from reliquaries in their design or their purpose. All of these things — saints’ bones, prayer snippets, rancid pig poison, sweet-smelling whale bile — were precious and private. They all afforded the wearer some sort of protection. Protection against the plague, protection against evil, protection against embarrassment. Even pomanders were about protection; it was often believed that illness spread through bad smells. According to the miasma theory, scents were a matter of life and death. A whiff of “bad air” could fell even the halest traveler. A pomander kept your smells from invading the rest of the world, and the world’s smells from infecting you.
There are examples of container jewelry from almost every era of human history and almost every corner of the globe. Perhaps there is something primal about our desire to squirrel away objects, to keep some precious little things on our bodies at all times. Maybe we need small things to feel big. I think, sometimes, that humans are drawn to things that are oversized and things that are terrifically undersized. Like Gulliver, we want to see worlds of both giants and manikins. We like dollhouses and lockets, giant nutcrackers and too-big wineglasses. These things remind us of childhood, and of dreams, places where reality is slippery and true faith is possible.
And maybe we hoard little parts of things in order to feel whole. Maybe prayers need something physical to attach to, hope needs something tangible to ground it, and grief a placeholder for an unspeakable absence.
* * *
Trends tend to grow slowly at first, bubbling under the surface of the collective consciousness. They simmer, sometimes for a few years, sometimes for a few hundred, until some precipitating event when suddenly, the once-obscure trend is everywhere.
That’s how it was with mourning jewelry. Since the 16th century, people had been commissioning jewelers to make them little mementos for their lost ones, rings and bracelets and lockets like the Chequers Ring, which has been dated to the mid-1570s and was worn by Queen Elizabeth I. The gold locket ring is in the shape of an E and adorned with white diamonds, rubies, and mother of pearl. Behind it is a secret compartment with two enamel portraits believed to represent Queen Elizabeth herself and her mother, Anne Boleyn, who was executed when Elizabeth was nearly three years old. Pieces like the Chequers Ring are thematic siblings to the memento mori jewelry that was popular at the time, which often featured jeweled coffins, delicate gold skeletons, and other macabre bits of shiny symbolism. Instead of reminding the viewer that they, too, will die, mourning jewelry reminded people that the wearer had experienced a loss, that they harbored great grief. Perhaps it also reminded the wearer that they had a right to their sadness. Mourning jewelry made absence visible and tangible. It made sadness present on the physical body.
Queen Victoria didn’t come up with the idea of mourning jewelry, but she did mourn more visibly and publicly than anyone else had, or could. Following the death of her husband Prince Albert in December 1861, Victoria entered a state of permanent mourning. She had the means to grieve decadently, and she did. She didn’t have just one locket for Albert, but several. She wore these charms on bracelets, brooches, and around her neck. It was her style; according to historian Claudia Acott Williams, Victoria’s first piece of sentimental jewelry was a gift from her mother and contained a lock of her deceased father’s hair, as well as several strands of her mother’s hair. During their very public courtship and wedding, “she and Albert would mark so many of those ubiquitous human moments that endeared her to the public with jewelry commissions that were widely publicized in the popular press and subsequently emulated by her subjects.” After Albert was gone, Victoria commissioned a gold memorial locket made with onyx and diamonds. Around the outside of the pendant, enamel letters spell out Die reine Seele schwingt sich auf zu Gott (“the pure soul flies up above to the Lord”). Inside, she placed a lock of Albert’s brown hair and a photograph of her deceased love. Victoria left instructions that, upon the occasion of her death, this locket be placed into Albert’s Room at Windsor Castle and left on display. It must have meant so much to her, that locket. It must have felt like a piece of her broken heart, an emotional wound made wearable and beautiful.
People of all socio-economic strata wore mourning jewelry of some kind. After all, you didn’t need to use costly gems; you could just give the deceased a post-mortem haircut and use the strands to create a bracelet or a ring. Some jewelry even featured bones in place of jewels (Victoria had a gold thistle brooch set with her daughter Vicky’s first lost milk tooth in place of the flower), though this wasn’t nearly as common as jewelry that featured woven, braided, or knotted hair. “If you’re poor, you wouldn’t have access to photography. That’s too expensive,” says Art of Mourning’s Peters. “But you could cut your hair off and pop it in a locket and give it to someone you love. That way, you can be with them always.”
Peters also notes that many jewelers trying to capitalize on the trend played a bit fast and loose with the sources for their hair weavings. Sometimes you’d go to a craftsperson and ask that a locket be made with your beloved’s hair, and you’d return home with a piece made from their hair — and then some. “A lot of the hair they used was from nunneries,” he explains. Some customers knew that the hair was being supplemented, but not everyone was aware of this practice.
Even more disturbing to Peters was the role that advertising played in the promotion of mourning goods and rituals. “Exploitation of death through grief is as certain as death itself,” writes Peters in an essay published in A Miscellany of Death & Folly. “In particular, fashion has been a focal point through which death has been exploited, due to its highly emotive nature.” Department stores stocked solely with mourning paraphernalia began to pop up. Peters makes it clear that these items weren’t necessarily all that personal. Often, each mourner that attended a funeral would be gifted a simple ring, and people tended to judge the lives of their peers by the type and quality of jewelry they left behind for grieving friends and neighbors.
The sentimental jewelry trend wasn’t confined to the Continent. It was also fashionable in America to wear hair brooches, silver lockets, and other personal pieces. After the Industrial Revolution, people from most social classes could buy mass-produced lockets, which they could then fill with photographs of their beloved or bits of their hair. Many of these were made in Newark, New Jersey, the jewelry manufacturing capital of the United States. The industry got its start there in the early 1800s, and by the late 1920s, Newark was producing 90 percent of the 14-karat gold jewelry in America. Alongside the full-color images of filigree gold pendants and colorful “fruit salad” bracelets and the essays about the shifting trends in American consumerism, The Glitter & The Gold: Fashioning America’s Jewelry tells tales of abuse and exploitation. Though the journeyman jewelers were fairly well paid, conditions in factories were generally grim and child labor was commonplace. Paid far less than their male coworkers, girls were often employed to do the most precise handwork, like fashioning gold watch chains or hand-painting enamel, because of their thin and dexterous fingers. “The jewelers work, in all its branches, is particularly trying to the eyes, and it not infrequently happens that defective sight compels men to abandon the trade,” reported the chief of the state’s Bureau of Statistics of Labor and Industries around the turn of the twentieth century. Smead adds that “respiratory disorders were also common — common enough to be the leading cause of death among jewelers.”
* * *
By the time the Civil War came about, many middle class Americans were purchasing costume and fine jewelry that was made in Newark (though often factories would mark their goods “London” or “Paris” since U.S.-made items wouldn’t come into vogue for another fifty years). Lockets, heart-shaped and oval, were particularly popular during this socially chaotic period, and showed up frequently in literature and art. It was common practice for soldiers and their sweethearts to exchange sentimental trinkets before the man marched off to battle. A posthumously published and mostly forgotten short story by Kate Chopin makes one such piece a central player: “The Locket” switches perspectives between a young Confederate soldier and his sweetheart. He had been wearing a locket, given to him by his girl at home, which he refers to as his good luck charm. After the battle, the same gold necklace is plucked off a corpse and mailed to the girl, who assumes that her love was killed. At the end, he returns home to find his lover dressed all in black. Another boy died, one who stole the locket believing that its “voodoo” would keep him alive. Our ersatz hero lives, thank the gods of love.
It’s a sentimental story about a sentimental piece of jewelry, and I can’t say I liked it much. It reminds me of a Nicholas Sparks story, or a Thomas Kinkade painting, or any other corny, sappy work of art. It drips with tears and snot. It has a hollow core: too much emotion, not enough meat. The story is set up as a tragedy, but at the last minute, Chopin pulls the rug out from under the reader and wraps them in a cozy blanket. Here, she says, here is what you wanted.
As for the boy who died? Well, we’re not supposed to think hard about him. Surely he deserved to die, for he was a thief and a coward. Like most sentimental works, it follows pat beats: a problem is set up, an exchange happens, a resolution is reached. In the end, the titular locket is revealed to have had no power — except to trick the woman into believing her love was lost, and perhaps to trick the robber into thinking he was safe on the battlefield.
That’s the dirty heart of the story. Maybe it’s not about the character’s great love, but the reader’s great fear. Fear that there is no protection from death, that there is no charm to keep away loss. Fear that unlike the boy in the story, your boy won’t come back.
Twenty-first century mourning has gone in two very different directions. It’s either become entirely intangible or deeply physical, almost to an obsessive degree. There are online guest books to mourn the dead, ghostly Facebook pages that live on “in legacy,” and online grief support groups, or you can buy diamonds made from the hair and ashes of a dead loved one. “Cremation diamonds are forever since they are diamonds made out of human ashes,” reads the website for Lonité, a Switzerland-based company that pressurizes the carbon-rich remnants of a body in order to “grow” amber-colored jewels that start at $1250 per quarter-carat, significantly less than most mined diamonds but slightly more than the average lab-grown diamond. Other companies will turn your ashes into glass beads or encase them in clay or metal. And while hair jewelry isn’t quite as fashionable as it once was, there are still hair artists who can weave a lock of hair into a keepsake.
It’s tempting to conclude that the ugliest part of lockets is what we put inside them—the poison, the remnants, the evidence of adultery, and the perfumed animal oils. But I think the worst part is how desperately we try to shrink down our emotions, to make them small and private and containable. Instead of sharing our fears aloud or wearing our sadness on the surface, we place it into jeweled containers, objects that latch and close and can be tucked under the shirt, inside the dress. We sublimate our emotions, turning gray flat ashes into brilliant, sparkling diamonds.
“If we can be called best at anything,” writes mortician and author Caitlin Doughty in From Here to Eternity, “it would be at keeping our grieving families separated from their dead.” She goes to a village in Indonesia, where dead bodies are paraded through the streets while mourners keen and wail and cheer; Mexico, where mummies sit on altars waiting for families to come and give them gifts; and Japan, where family members visit a high-tech crematorium to gather up fragments of their lost and loved with chopsticks. To Americans, she admits, these customs may seem disrespectful. But they are not. They’re ways of working through grief. Giving mourners a task grants them purpose and a sense of control. Giving mourners a public space to celebrate their dead offers much-needed moments of physical and emotional catharsis. Giving mourners access to the dead body provides a sense of closeness and closure.
American culture lacks these rituals. Instead, we have single-day funerals. We have mass-produced headstones, mass-produced urns, mass-produced lockets that allow us to minimize loss without moving through it. There is no federal law that grants paid bereavement leave, not even for the death of a spouse or a child. Your interior world may have collapsed, but you are still expected to prove your worth. Grieve, but be productive.
Peters argues that hair art isn’t morbid, but rather a healthy sign that people can “live with” grief. I’m not so sure. I tend to agree more with McLaughlin, who stresses the locked-away part of the locket. “Lately, I feel like everything is about control,” says McLaughlin. “The world is bursting into flames around us and there’s basically nothing we can do about it, so instead we cling harder to the tiny things that mean something to us.” And maybe, she adds, the act of keeping these things “close and hidden away from others heightens that feeling of safety and control.” We don’t come together and howl in grief. We don’t keen at the sky or wail around the pyre or hold our dead tightly and brush their hair.
I have a cousin who died young from suicide. He was a few years older than me, and I spent the first sixteen years of my life looking up to him. He painted his nails with sparkly blue polish and dyed his hair black. He could do an incredible Irish accent. He took drugs and defended me from the worst abuses of my older brother. He was protective of me, and I loved him for it. I have very few memories of the funeral. I was deep in a depression of my own, and hadn’t yet discovered the value of medication. Many of my memories from those years are foggy and insubstantial, clouded by grief, marijuana, and hormones. I sometimes re-read the guestbook at Legacy.com where people write him messages. I receive email alerts when new posts are added. I am glad it exists, but it feels terribly incomplete. In grief, everything feels incomplete.
I do not have a necklace with a locket holding his dyed hair, but I do have a tiny little pill container that attaches to my key ring. In it, I have three pills. They soothe me, they calm me, they give me a sense of control. It’s with me at all times. I have often dared to imagine a world where I didn’t need them. Where I could cry in public, wail on the street, get snot and tears on my good clothes. Where I could allow emotions to be as big as they needed to be. Until then, I have my version of the poison ring, the pomander ball, the little locket, designed to protect. Designed to contain.
* * *
Katy Kelleher is a freelance writer and editor based in Maine whose work has appeared in Art New England, Boston magazine, The Paris Review, The Hairpin, Eater, Jezebel, and The New York Times Magazine. She’s also the author of the book Handcrafted Maine.
Katy Kelleher | Longreads | October 2019 | 18 minutes (4,621 words)
* * *
Everyone thought it was gone. The woods would no longer welcome the late-spring appearance of its pendulous yellow lip, twisted maroon petals, and thick green foliage. Although lady’s slipper orchids continued to bloom throughout the wild woods of Europe and North America, this particular species (Cypripedium calceolus) had been declared extinct in England as of 1917. Collectors had destroyed the plant in the early 20th century, suffering from what was then known as “orchidelirium,” an incurable psychological illness marked by a need to pillage and possess, to strip the landscape bare and imprison one’s precious findings behind the four walls of a personal greenhouse.
But Cypripedium calceolus wasn’t entirely lost. A few small plants were still growing wild from seed, working their thick white roots into the forest soil. The species grew slowly and survived in secret. When a botanist found one growing in Yorkshire in the ’30s, the discovery was kept quiet; botanists feared the plant would be poached again, and so for four decades, no one knew about the lady’s slipper’s return to Britain.
Eventually, the secret got out. While botanists worked to reintroduce the flower to the wild and start a new population of yellow-lobed blossoms, collectors caught wind of the miraculous return of the lady’s slipper. For a while, the specimen — growing on the Silverdale Golf Course — was relatively safe, thanks to its obscurity. Then, in 2004, someone got greedy. A thief stole onto the grounds in the middle of the night and attempted to steal an entire plant. It was found later, mangled, but still alive; the thief got away with a small cutting. In 2009, another poacher got away with a large piece of orchid, leaving just six flowers behind.
The orchid is now under police protection during its flowering months, from late May to early July. As far as I can tell, police set up tape around the growing area, assigned an officer to patrol the course regularly on foot, and considered installing CCTV cameras, though it’s unclear whether they ever actually began to film the plant. The tape and the patrolman, however, remain as deterrents, and the plant, one of about a dozen in the U.K., continues to flower annually.
Orchid mania didn’t begin with lady’s slippers. It began with exotic specimens, introduced to English gardeners and noblemen in the late 18th century. While many of them had seen botanical drawings of tropical orchids, the live specimens were something else entirely. Their strangely shaped flowers and bright colors sparked a fixation that came to exemplify the values of the period, embodied in the heroic white adventurer who risks his life to harvest the knowledge and beauty of other lands, returning victorious to his home after striding across harsh landscapes, battling his way through jungles, and fighting man and beast to achieve his goals. The orchid stood for supremacy — of knowledge, of culture, of whiteness. It stood for expansion and colonialism. The way Western countries have treated orchids reflects how we’ve come to understand entire sections of the map. Instead of the old saying, “Here there be dragons,” Western explorers looked at the blank areas of their maps and thought, Here there be loot.
If Cypripedium calceolus is afforded official privileges, it’s not because of its beauty. It’s for its symbolism: It’s a stand-in for Britain’s native wildlife. Visiting this rare flower is a way for people to show their fealty to the land itself, to participate in a romantic rewriting of history, where they always loved their green islands and white cliffs and were only ever trying to extend those same gifts to others.
* * *
It is not often that a plant inspires pilgrimages or gets police protection; for the most part, we view plants as one of the lowest forms of life. The hierarchy is usually: human, animal, insect, plant, fungi, bacteria, virus. We assumed for centuries that plants were stationary, unthinking, unfeeling, and unable to send even rudimentary messages to one another (we now have evidence that this is untrue — plants do talk, plants do listen). For centuries, we’ve valued plants primarily based on how good they are for eating, or for looking at. Until we began to understand more complex scientific ideas like ecological diversity, carbon sequestration, and rewilding, those were our primary motivations for growing plants: taste and beauty.
Orchids have no taste, though many are edible. (Orchid petals taste, I can report, like water.) What they have by the boatload are looks. I think of orchids like little dandies, dressed in different outfits for different occasions. There are sturdy orchids that grow from swamps and would seem to enjoy long meandering walks through the countryside in tweed and green wellies. There are delicate orchids that do not like to be moved and restrict themselves to flashing their colors at passersby from their perch in the trees, like a glam wedding guest toasting the bride from a corner. There are orchids that look like ballerinas, dressed in tutus for their next performance, and orchids that look like businessmen, stiff and upright and ready to work.
Orchids, as a plant, may date back as far as 50 to 100 million years, making both the Victorian orchid craze and the contemporary passion for orchids a blip in their overall history. While we weren’t paying attention, they were evolving complex pollination mechanisms. They were forging relationships with bees and other insects, becoming increasingly specialized. They were growing in ever more fantastic shapes and developing ever more unlikely adaptations. Members of the orchid family grow absolutely everywhere — on every inhabitable continent, which just means they haven’t figured out a way to thrive in Antarctica yet. There are about 28,000 currently accepted species of orchid (which doesn’t include 100,000 or so hybrids and cultivars introduced since the Victorian period). They live in the temperate woodlands of Sweden and in the arid rocky soil of Arizona. They hang from trees in humid tropical jungles and decorate the mountains of the Middle East.
Yet when most people close their eyes and imagine an orchid, they picture a tropical variety. Perhaps the moth orchid, which you can buy in almost any grocery store or gift shop. These orchids have big fuchsia or white petals and sepals surrounding a delicately proportioned “lip” and “throat” (i.e., the flower’s sex organs). Or maybe they picture the pale and eerie ghost orchid, the subject of Susan Orlean’s The Orchid Thief, a book that served as source material for the Academy Award–winning movie Adaptation. Meme lovers might know about the monkey-faced Dracula orchid, whose flowers resemble little simian faces, or the Italian orchid, which looks like a big-dicked stick figure (thus earning the nickname the “naked man orchid”). And there are plenty more orchids that you wouldn’t even know are orchids. I had a weird little plant growing in a pot in my bathroom; I’d dug it up from my backyard because I liked its broad variegated leaves. Only in researching this piece did I discover that I, a known killer of potted orchids, have been growing one for months — the downy rattlesnake plantain. But these ordinary orchids — the spiky green bog orchids and plain pale ladies’ tresses — didn’t change the history of knowledge. Not like those flashy tropical flowers did. North American and English native orchids are important to their ecosystems, but they’re not the ones that caught Charles Darwin’s eye.
Darwin’s admiration for fauna is well documented in On the Origin of Species (1859), but people often forget about his devotion to flora. Even Darwin calls his 1862 orchid study a “little book,” but it was a little book with a long name — On the Various Contrivances by Which British and Foreign Orchids are Fertilised by Insects, and On the Good Effects of Intercrossing — and a big impact. The dense book argued that “every trifling detail” of orchid structure was not necessarily the result of “the direct interposition of the Creator,” but of centuries of wooing insects into their hairy parts. Although orchids have both “male” and “female” organs (stamens and pistils) contained within one flower, they don’t pollinate their own ova. Instead, they work with insects to get the job done, ensuring intercrossing rather than inbreeding. (Darwin may have had a personal stake in his argument; he felt quite a lot of guilt over marrying his first cousin, an act that he thought may have contributed to the deaths of his “rather sickly” children. “If inbreeding was bad for Charles and Emma’s offspring,” Jim Endersby writes in Orchid: A Cultural History, “self-fertilization (the ultimate form of inbreeding) ought to be especially bad.”)
In efforts to attract insects and spread their pollen, orchids have developed some truly wild shapes. Oncidium henekenii is an iridescent red flower with yellow ruffled petals that looks quite a lot like a “fetching female bee,” according to David Horak of the Brooklyn Botanic Garden. The orchid not only looks like a bee, it smells like one. “When the male lands on the flower, it grabs the labellum and attempts to copulate with it,” writes Horak. “In the process, the flower deposits pollinia on the insect’s head, to be carried to the next flower he visits.” Other orchids lure in insects with colors and shapes that mimic those of more nutritious flowers. Orchids pollinated by flies or carrion beetles are often brown and reek of rotting flesh. Slipper orchids are some of the most devious; they use their big, bucket-shaped labellum to trap bees and bugs. The bugs fly in, thinking they’re going to get some nice sweet nectar, and find themselves stuck in an empty cavity. The only way out is through a hairy hole, just big enough for the insect to sneak through. As the still-hungry insects climb out, they brush against the pollen-covered hairs and leave decorated with the orchid version of semen.
These adaptations have compelled Michael Pollan to call orchids “the inflatable love dolls of the floral kingdom,” skilled practitioners of “sexual deception.” Orchids are, according to Pollan, rather fantastic liars who evolved alongside insects, luring them in time and again with the promise of “very weird sex.” Thanks to this long-term fuck-buddy relationship, there are plenty of orchid species that can only be pollinated by a specific corresponding insect species. After learning a few of their adaptations, you can spot patterns, see which lock will fit which key. Darwin’s study of orchids led him to predict the existence of a long-tongued moth when an orchid grower in Madagascar sent him a sample of a star-shaped white orchid with a long, dangling nectary that could grow to almost a full foot long. Upon seeing it, he wrote a friend, “Good Heavens what insect can suck it?” before going on to suggest that, “in Madagascar there must be moths with probosces capable of extension to a length of between ten and eleven inches.” Two decades after Darwin died, scientists found a subspecies of Congo moth (commonly known as Morgan’s sphinx moth) with a prolonged proboscis.
It wouldn’t have been possible for Darwin to examine orchids so closely without access to orchids. While his other works had him trotting around the globe, he researched his little orchid book while hanging out with his family in England. At this time, growing tropical orchids in backyard greenhouses was an incredibly popular pastime for upper- and middle-class men. It supposedly started in the early 1800s, when a British naturalist named William John Swainson sent a bunch of orchid tubers back from Brazil. Ironically, Swainson had used the tubers to package other specimens, but the tubers grew and blossomed, surprising everyone. The 1800s also saw the golden era of the modern greenhouse, an architectural movement spearheaded in England by Sir Joseph Paxton. A gardener who rose to knighthood, Paxton created one of the first modern English greenhouses for the Duke of Devonshire in the 1830s (Paxton later designed the famous Crystal Palace for the Great Exhibition of 1851). The visibility of these elegant glass structures inspired a proliferation of greenhouse building among the upper classes. Made with iron bars and cheap, factory-made glass, these grow houses gave people a place to grow tropical plants that wouldn’t otherwise thrive in England’s temperate climate. This was also a period of rapid imperial growth and expansion that brought more orchid varieties to English shores. “Local networks of colonists, missionaries, and traders made it easier to recruit indigenous guides and porters, and to obtain information and supplies that allowed expeditions to reach and explore previously un-botanized areas,” writes Endersby.
As more and more orchids arrived in England, the flower became further coded. Any old gardener could grow a rose bush, but to grow an orchid you needed a greenhouse — and connections. James Bateman’s 1845 book The Orchidaceae of Mexico and Guatemala speculated that “Orchido-Mania” pervaded all classes, but especially the “upper.” Bateman also suggested that orchids were nature’s green patricians. According to Endersby, Bateman wanted hobbyist gardeners to stay in their lane. Aristocratic people should grow aristocratic flowers, for “the happiness of the community at large.” This is but one reading of Bateman’s argument — he also makes it clear that all of society can benefit from seeing greater plant diversity — yet Bateman’s words still reflect a certain sense of noblesse oblige. It was inevitable, Bateman thought, that the upper classes would grow orchids and the lower classes would grow humbler flowers like tulips and carnations. It may not have been ideal, but it was the way of the world.
The high expense of orchid-rearing didn’t much deter the rise of floral madness. Those who couldn’t participate firsthand were able to live vicariously through the legendary antics of plant poachers. People were hungry for exotic flowers, and equally hungry for stories of their capture. Dozens of orchid hunters died abroad, killed by illness, accident, or foul play. “In 1901, eight orchid hunters went on an expedition to the Philippines,” writes Orlean in The Orchid Thief. “Within a month one of them had been eaten by a tiger; another had been drenched with oil and burned alive; five had vanished into thin air; and one had managed to stay alive.” The last man standing walked out of the jungle with either 47,000 or 7,000 orchids, depending on the source. In 1891, an Englishman named Albert Millican published a memoir of his time spent orchid-hunting in the Andes, Travels and Adventures of an Orchid Hunter. As he travels through the Andes, he meets Native men and women who he disparages and lusts after, respectively. He sees his companions pierced with poison arrows and doesn’t seem particularly bothered by their passing. He also doesn’t seem to love orchids all that much: They were a means to an end. Poachers would harvest as many specimens as they could, leaving no tubers behind to regrow the population. Some orchid hunters cared about scientific advancements, certainly, but most were after money and fame. They could come back with both high-priced stock and tales of wild panthers and wild women, cannibals and conquests.
As the 19th century wore on, orchids and death became more explicitly associated. It wasn’t just that people died in their quests to procure them; orchids themselves were also seen as deadly. Stories circulated about orchids found growing in graveyards and on human remains. “In the late 1800s an Englishman in New Guinea discovered a new variety of orchid growing in a cemetery,” writes Orlean. “Without bothering to get permission he dug up the graves and collected the flowers.” (He gave the people of the nearby town a few glass beads to pay for his desecration of their ancestors.) Another orchid hunter sent home plants attached to shin bones and ribs, and still another brought a flower growing from a human skull. This last find was auctioned off at Protheroe’s of London, sparking a series of think pieces on these gothic curiosities, these bloody orchids.
As in life so in fiction, and 19th- and 20th-century pulp literature is awash with dangerous flowers. My favorite entry into this highly specific canon is The Flowering of the Strange Orchid by H.G. Wells. First published in 1894, it tells of a short, nebbishy orchid collector named Winter Wedderburn who laments to his housekeeper that, “nothing ever happens to me.” Later that day, he goes into London and returns with several orchid roots. Most of them are identified by the sellers, but one is not. “I don’t like the look of it,” says his housekeeper, comparing it to “a spider shamming dead” or “fingers trying to get at you,” before defensively telling her boss, “I can’t help my likes and dislikes.” But to Wedderburn, this root is an opportunity. Something, he hopes, might happen.
Of course, something does happen. After time in his overly hot greenhouse, the orchid blossoms. The “rich, intensely sweet” scent of the flowers makes him dizzy; it overpowers all other smells in the greenhouse. It also overpowers Wedderburn, who passes out, to be found later by his trusty housekeeper. He is alive, but barely: Fingerlike aerial roots have swarmed over his body, “a tangle of grey ropes, stretched tight” attached by “leech-like suckers.” The housekeeper saves poor Wedderburn by breaking the windows and dragging him outside. The bloodthirsty orchid is left to die in the cold with all of Wedderburn’s other plants.
Once he recovers, Wedderburn finds himself thrilled by his little adventure. He’s had a brush with the exotic, hypermasculine world of orchid hunting, and he came out on top. What a feat for such a quiet, milquetoast little man.
* * *
At the age of 7, I became an orchid mangler, like the unnamed thief of Silverdale. I suppose I could claim I was struck by orchidelirium — it wasn’t my fault, officer! — but that’s not quite true. I had flower delirium in general; I picked flowers from my neighbors’ gardens and ate the violets that dotted our yards. I stole flowerheads from grocery store bouquets. I liked the colors. I wanted to keep them all, even the dyed carnations wrapped in cellophane, even the jewelweed that grew in the swampy parts of our neighborhood. I didn’t know that orchids were rare, nor would I have cared. I wanted one of those pink, bulbous flowers — a pale ballet pink, like the inside of a seashell or my mother’s fingernails — so I picked it. (When my mother found out she sat me down and explained endangered species. I never picked another lady’s slipper.)
Looking back, it shouldn’t have been hard to resist the call of the lady’s slipper. Lady’s slippers are, in my opinion, kind of ugly. Our New England variety reminds me of human testicles, covered in spiderlike veins, more fleshy than flashy.
This isn’t a terribly imaginative comparison; orchids have been associated with balls since ancient times. The word “orchid” comes from the Greek word for testicle, órkhis. The Greeks were inspired by the plant’s rounded tubers, which often grow in a pair, one larger and one smaller. Ancient physicians believed that these roots could both cause erections and stop them, depending on which tuber you picked. (The aphrodisiac and the boner-killer followed the same recipe: Stew in goat’s milk, drink hot root broth, wait. The big one would make the organ swell, the small one would quell lust.) In medieval Europe, orchids often went by folk names, like fox stones, hares-bollocks, sweet cullions, dogstones, and goat’s stones. (In case further clarification is required: Stones, bollocks, and cullions are all vulgar synonyms for the family jewels.)
It’s difficult to say precisely when orchids became more closely associated with the female body, but during the height of orchid mania, these flowers were often understood as somehow feminine. This makes some visual sense: Aside from the roots, orchids tend to look more vaginal than phallic. But it’s not really about what the flower looks like. It’s about how they were collected, harvested, conquered, bred. And (as usual) it’s about sexism. Flowers were, like women, passive players in procreation. (Darwin didn’t have this hang-up, a small point in his favor.) One 19th-century growing manual deemed orchids “marvelously docile … as with women and chameleons, their life is the reflection of what is around them.”
When orchids were given agency, they were seen as treacherous. Their sweet scent could lure you in, their beauty might trick you into doing something foolhardy, their silent presence was enough to drive a man wild. Orchids were the femmes fatales of the flower world. Popular short stories like “The Purple Terror” by Fred M. White (1898) and “The Orchid Horror” by John Blunt (1911), as well as novels like Woman of the Orchids by Marvin Hill Dana (1901) blur the line between blossom and woman. In each of these narratives, the reader is cast in the role of the male explorer who is seduced by both the promise of fabulous flowers and the hope of getting closer to an alluring, exotic woman. For Endersby, these stories show not only the fear of women’s shifting societal roles, but also the fear of (and desire for) the tropics, “ripe with sickness and scheming natives, embodied in seductive exotic women.” He goes on to suggest that dangerous orchids like Wedderburn’s “seem to imbue women with qualities that were simultaneously repellent and seductive.”
The role of the orchid collector, then, was to tame the dangerous woman. To own her, to coax forth her beauty in a safe, contained space. To take her out of her natural habitat and show her how to live; growing orchids as wish-fulfillment. It allowed these men to feel virile and manly, as though they had imposed their will on nature itself. Inside the tidy walls of a steel-reinforced greenhouse, they could be masters of their own little harem. If Hugh Hefner had been born 100 years earlier, I imagine he would have kept orchids.
* * *
As we slide further into the 21st century, the echoes of orchid mania still reverberate. The contemporary collector still dreams of a chance to play Columbus, to discover a new species and slap his name on it. I didn’t know this when I first visited the Montreal Botanical Garden in winter of 2019. I only knew that I wanted to get warm and to see some interesting greenery. I saw yellow orchids and pink orchids and so many white frilly orchids. I also saw the fuchsia petals of the famous Phragmipedium kovachii slipper orchid.
The story of the kovachii flower is covered at length in Craig Pittman’s riveting book The Scent of Scandal, but in short: In 2002, an American orchid collector named Michael Kovach was traveling with his friend, “The Adventurer” Lee Moore (this nickname is printed on his business cards, so he’s that kind of guy), when the duo came across a roadside stand selling huge magenta orchids. The slipper orchids had a brightly colored labellum surrounded by two massive petals and were about the size of a hand, fairly large for an orchid. Kovach was psyched to have discovered an undocumented species, bought several of the plants, and brought them back to America. He didn’t, however, get the proper permission to do so. He didn’t fill out the paperwork, he didn’t wait to get approval. He just packed them in his suitcase and brought them to America.
You can’t just take wild orchids from one country to another — there are rules about these things. Orchids are covered by an international treaty called the Convention on International Trade in Endangered Species (CITES), which specifies that you can only export orchids that were grown in a nursery or a laboratory. It’s illegal to fly out of the country with a wild orchid and bring it to your favorite botanical garden, where you hand it over to the researchers and suggest that they name the new species after you.
That’s exactly what Kovach did, with widespread repercussions for both the botanical garden and other orchid importers. Kovach was punished, as was another importer from Texas, who also brought in illegal plants (while Kovach didn’t receive jail time — only probation and a fine — others weren’t so fortunate). It was a huge legal case, though Stéphane M. Bailleul of the Montreal Botanical Garden says it’s just “human nature that prevented everything from being done properly.” (Tell that to the scientists in Peru, who were pretty pissed that an American got to name one of their native species.) The case, Bailleul says, “highlights the difficulty of getting new species out and describing new species. The intention wasn’t to plunder the population, the intention was to describe the species, to examine it, to take the measurements,” which may be both true and the most generous reading of events.
Pittman, author of The Scent of Scandal, has a slightly different take. Orchid people, he explains, “tend to be obsessive, fairly well educated, and somewhat opinionated.” Pittman believes that orchid collectors lust after rare plants primarily because they “want to feel special. They want to feel superior to others.” Even if no one else sees your collection, you know you have something special, something exotic and singular and strange. But Pittman also seems to suggest that Kovach, Moore, and the team of scientists at Marie Selby Botanical Gardens all believed that they were doing the right thing, at least to some extent, by describing the species. They were making the plant known. They were adding to scientific knowledge, expanding our collective understanding of the wild world of plants.
Yet this is precisely what stuck with me after I closed Pittman’s book and picked up my next orchid-centric read, Orlean’s The Orchid Thief. It seems to make sense that scientific advancement is worth it, that it is for the good of all humanity that we dig as deeply into the natural world as possible, understanding every nook and cranny and leaf and bee. Even if it means we’re steamrolling over other countries’ rights to “discover” their own plants. Kovachii is a rare, prized species of orchid, one that you can visit at many major botanical gardens. I, personally, have benefited from this theft, even if I didn’t know it at the time. I saw something rare, something special, something new to the world of science.
And yet, what would have happened if we’d left orchids where they were? What would have happened if we’d left countries as they were, people as they were? The lust for orchids is fueled by our appreciation for beauty, our love of bright colors. But lots of flowers are pretty, so it’s safe to say this particular phenomenon isn’t just about prettiness. Orchid mania is an ongoing illness that reflects a sickness at the heart of Western culture where white scientists know best, Western countries deserve to rule over realms of knowledge and beauty and truth, and America and England get to write the stories of the world and determine which species gets which name. The story of orchid madness isn’t just a story of quirky adventurers and daring British men facing down tigers. It’s also a story of masculinity, white supremacy, and entitlement. It doesn’t matter whether the first tropical orchid sailed into England thanks to a packing mistake. It doesn’t even matter whether all the orchids we collect now are coming here by the book. Orchid madness persists and has spread to local plants and endangered species on golf courses and in backyards. When you boil it down, it’s all about the impulse to pull something up, root and stem, to possess a piece of beauty even as you know, logically, that you’re going to kill it. It’s not a story of loving something to death, as I first thought. It’s a story about the fetid swamp of desire that grows within all of us, a place where entitlement festers in deep water polluted by history, by cultural forces we don’t dare to name.
* * *
Katy Kelleher is a freelance writer and editor based in Maine whose work has appeared in Art New England, Boston magazine, The Paris Review, The Hairpin, Eater, Jezebel, and The New York Times Magazine. She’s also the author of the book Handcrafted Maine.
Editor: Michelle Weber
Factchecker: Jason Stavers
Copy editor: Jacob Z. Gross
Katy Kelleher | Longreads | July 2019 | 21 minutes (5,409 words)
In The Ugly History of Beautiful Things, Katy Kelleher lays bare the dark underbellies of the objects and substances we adorn ourselves with.
* * *
Eight thousand years ago, a craftsperson sat inside their mud-brick house in Turkey and rubbed a piece of obsidian with their hands, smoothing the surface carefully, polishing the stone until it shone darkly in the hot sun, turning a piece of volcanic rock into something miraculous. In this piece of black stone, they could see their reflection, surrounded by the walls of their dwelling, built on the bones of their ancestors, the painted plaster walls rendered colorless by the obsidian’s deep gloss. But they weren’t done. They took white plaster and applied it to one side of this stone disk in a conical shape. Eventually this stone came to rest in a grave, alongside a woman from the early agricultural society. There it stayed until archeologists found it in the 1960s. It is, as far as we know, one of humankind’s first mirrors.
According to archeologist Ian Hodder, who oversees the hilly, 34-acre archeological site at Çatalhöyük in central Turkey, there have been “five or six” obsidian mirrors found there, all located in the northeast corners of tombs belonging to women. “They are beautiful things,” he says of the Neolithic mirrors. “Nobody really expected there would be things like mirrors in those early days. These are the first sort of settlements after people have been living as hunters and gatherers. In many ways, these were quite simple societies, so it is odd.” Yet these early proto-urban people clearly wanted to look at themselves — or at something. It’s possible the mirrors were used in rituals by shamans or other religious figures. “One of the most commonly suggested [interpretations] for the time period is that they’re something to do with predicting the future or understanding the spirit world through reading images in the mirrors,” says Hodder. We just don’t know. We’ll probably never know.
With a name taken from the Latin mirare and mirari (“to look at” and “to wonder at, admire,” respectively), a mirror can be any reflective surface created for the purpose of seeing oneself. They can be made of stone, metal, glass, plastic, or even water. Throughout history, we’ve constructed mirrors from all those substances, with varying degrees of efficacy, for various reasons. Some were used as ceremonial items, others were used to repel malevolent spirits, and still others were used for the simple pleasure of examining one’s countenance.
But no matter what they’re made of, mirrors are objects of mystery, obsession, and fear. They’re simple yet complex. They’ve been used for purposes both sacred and profane. We love them, yet we’re loath to admit it. Even their creation has been shrouded in secrecy and aided by willful ignorance and sometimes outright violence; mirror making was once a toxic affair, and its secrets were guarded by laws and punishable by death. Long reserved for the wealthy few, mirrors now travel in our pockets, and even if you left yours at home, there’s always a cell phone screen that can function, if you want it to, if the light is right, as a mirror.
Often, when objects become mundane, they lose some of their luster. But mirrors retain their ability to hold our attention, and they retain a certain amount of power over us. We’re still interested in seeing our reflections, and we still want to know what the future holds. Yet we’ve lost the reverence we once had for them. We no longer bury our dead with hand mirrors, and we don’t often speak of the control a mirror can exert over a person. Instead, we allow this force to alter our perceptions, to diminish our happiness, while denying its power. Looking in a mirror is just something you do — just something women do. We’re so used to seeing this impulse as vanity that most of us have forgotten the innate sense of awe that comes with looking. We’ve forgotten how to face our reflections not with judgment or fear, but with a sense of joyful discovery, a sense of hope. We can see our reflections anywhere, yet we still face the mirror with a certain amount of suspicion, as though desiring knowledge of how the world sees us is somehow wrong.
Katy Kelleher | Longreads | March 2019 | 16 minutes (4,107 words)
* * *
“There was once upon a time a very old woman, who lived with her flock of geese in a waste place among the mountains, and there had a little house,” begins The Goose Girl at the Well. Published by the Brothers Grimm, this strange little story describes a princess who comes to live with a poor crone in that wretched waste place after she fails her father’s Lear-like test to profess her love and devotion. The girl is lovely, as befits a fairy-tale princess — “white as snow, as rosy as apple-blossom, and her hair as radiant as sun-beams” — but there is one detail that always snags in my mind: “When she cried, not tears fell from her eyes, but pearls and jewels only.”
The rest of the story is a bit boring, I’m sorry to say. The girl returns home, the king learns his folly, and the old woman disappears into thin air, taking only the precious stones that fell from the girl’s magical tear ducts. But it ends on a funny note:
This much is certain, that the old woman was no witch, as people thought, but a wise woman who meant well. Very likely it was she who, at the princess’s birth, gave her the gift of weeping pearls instead of tears. That does not happen now-a-days, or else the poor would soon become rich.
I wish Grimm’s narrator had lived to see our world, one where pearls are so inexpensive that almost anyone can own a pearl necklace or a set of earrings. These gemstones are no longer precious, and they come neither from red-rimmed eyes nor from secret caverns in the ocean, but from underwater baskets strung together on sprawling sea-farms. Pearls were once mystical objects, believed by some to be the tears of Eve, by others to be the tears of Aphrodite. There are stories of pearls falling out of women’s mouths when they utter sweet words, and pearls appearing from the spray of sea foam as a goddess is born. Now we know better: pearls are made from some of the most basic and common building blocks of nature — calcium, carbon, oxygen — arranged into calcium carbonate particles, bound together by organic proteins. They are created out of animal pain, which has been sublimated into something iridescent and smooth, layered and lovely. Born of irritation, these gemstones can be mass-produced and purchased with the click of a button. These gems, like so many things, have lost some of their luster thanks to the everyday degradation of value that comes with globalization and 24/7 access to consumer goods. Thanks to Amazon, you no longer need to plumb the depths of a river or visit a jeweler to purchase a set of freshwater pearl drops. With one-click ordering, you can have a pair of dangling ivory orbs delivered to your house within days — in some places, hours.
And yet: imagine opening an oyster and seeing that slimy amorphous lump of muscle, and nestled among it, a single pearl. The fact that such iridescent, shape-shifting beauty can come from a mucus-y mollusk remains something of a miracle, primal evidence that the world orients itself toward beauty. Or so I want to believe.
Soraya Roberts | Longreads | December 2018 | 10 minutes (2,554 words)
On November 30th, Tavi Gevinson published her last ever editor’s letter at Rookie. The 22-year-old started the site when she was just 15, and in the intervening years it had spawned a pastel-hued community of girlhood, which, if not always sparkly, was always honest. The letter spanned six pages, 5,707 words. In Longreads terms, that’s 20 minutes, 20 minutes of Gevinson agonizing over the site she loved so much, the site that was so good, that was now bigger than her, that she couldn’t figure out how to save. “Rookie had been founded, in part, as a response to feeling constantly marketed to in almost all forms of media,” she wrote, “to being seen as a consumer rather than a reader or person.”
The market had won, but Gevinson was fighting to the death. It was hard to read. You could sense her torturing herself. And she was. Because in truth there was nothing Gevinson could have done, because the failure of Rookie was not about her, or even about the poor state of media as a whole. It was about what it has always been about, which is that as much power as women have online — as strong as their voices are, as good as their work is, as valuable as it is to women, especially young women — its intrinsic worth is something capitalism, dominated by men, feels no obligation to understand. This is what ultimately killed Rookie. And The Hairpin. And The Toast. And maybe even Lenny Letter too.
In her first ever editor’s letter, Tavi Gevinson explained that she wasn’t interested in the “average teenage girl,” or even in finding out who that was or whether Rookie appealed to her. “It seems that entire industries are based on answering these very questions,” she wrote. “Who is the typical teenage girl? What does she want? (And, a lot of the time, How can we get her allowance?)” She claimed not to have the answer but provided it anyway by not asking the question: by not inquiring, like other young women’s publications, whether her readers would like some lipstick or maybe some blush with that. Instead, Rookie existed in a state of flux, a mood board of art and writing and photography on popular culture and fashion and politics and, just, the reality of being a girl. In an interview with NPR in 2011, Gevinson noted the hypocrisy of other teen magazines’ feminist gestures: “they say something really simple about how you should love your body and be confident or whatever, but then in the actual magazine, there will still be stuff that maybe doesn’t really make you love your body.”
Writer Hazel Cills emailed Gevinson when she was 17 to ask if she could join Rookie. In her eulogy for the site, published in Jezebel, Cills described the magazine’s novel concept: “unlike Teen Vogue or Seventeen, we were overwhelmingly staffed with actual teenagers, and were free to write about our realities as if they were the stuff of serious journalism.” Lena Singer, who was in her 30s when she worked as Rookie’s managing editor, thinks the publication deserves some credit for the fact that adults are now more willing to defer to adolescents than they were when it launched. “Part of my role as an editor there was to help protect the idea — and I still believe it — that the world doesn’t need another adult’s opinion about teen spaces, online or elsewhere,” she says. “Teens say what needs to be known about that.” And when they didn’t have the answers, they chose which adults to consult with video features like “Ask a Grown Man,” where celebrities like Thom Yorke answered readers’ questions. The column would have been familiar to Sassy aficionados, particularly fans of its “Dear Boy” series which had guys like Beck offering advice. Which made sense, because Sassy was basically the OG Rookie.
Named by the 13-year-old daughter of one of the heads of its publishing company, Fairfax, Sassy arrived in 1988 and was the first American magazine that actually spoke the language of adolescence. Teen publications dated back to 1944, the year Seventeen launched, but Sassy was different. “The wink-wink, exasperated, bemused tone was completely unlike the vaguely disguised parental voice of Seventeen,” write Kara Jesella and Marisa Meltzer in How Sassy Changed My Life: A Love Letter to the Greatest Teen Magazine. And unlike Teen or YM, it did not make guys the goal and girls the competition — if it had a goal at all, it was to be smart (and preferably not a conservative). Sassy was launched as the U.S. iteration of the Australian magazine Dolly — they originally shared a publisher — and presented itself as the big sister telling you everything you needed to know about celebrity, fashion, and beauty but also drugs, sex, and politics. “The teen magazines here were like Good Housekeeping for teen-agers,” Dolly co-founder Sandra Yates told the New York Times in 1988, adding, “I’m going to prove that you can run a business with feminist principles and make money.”
So she hired Jane Pratt, an associate editor at Teenage magazine, who matched her polka dot skirt with work boots, who donated to a pro-choice organization. Pratt “cast” writers like Dolly did, then went further to reinforce their personalities by publishing more photos and encouraging them to write in the first person, with plenty of self-reference, culminating in a sort of reality TV show-slash-blog before either of those things existed. Sassy became ground zero for indie music coverage thanks largely to Christina Kelly, a fan of Slaves of New York author Tama Janowitz who wrote the way teenagers talk. “I don’t know how to say where my voice came from,” she says. “It was just there.” Like the other writers on staff, she offered a proto-Jezebel take on pop culture, a new form of postmodern love-hate criticism.
At its peak, Sassy, which had one of the most successful women’s magazine launches ever (per Jesella and Meltzer), attracted 800,000 readers. But this was the era of the feminist backlash, when politicians were doubling down on good old American family values. The writers and editors at Sassy weren’t activists, per se, but they were the children of second wavers, they went to universities with women’s studies departments, they knew about the patriarchy. “Sassy was like a Trojan horse,” wrote Jesella and Meltzer, “reaching girls who weren’t necessarily looking for a feminist message.” Realizing that adolescents were more sexually active than mainstream media let on, and receiving letters about the shame surrounding it, Sassy made it a priority to provide realistic accounts of sex without the moralism. They covered homosexuality, abortion, and even abuse, and were the first teen magazine in America to advertise condoms.
In response, right-wing religious groups petitioned to boycott Sassy‘s advertisers; within several months the magazine lost nearly 20 percent of its advertising. After several changes in ownership, including the removal of Sandra Yates and a squarer mandate, the oxymoronic conservative Sassy eventually folded into Teen magazine in 1997, the alternative press devoured once again by the mainstream.
But Sassy left behind a community. A form of analog social media, the magazine united writers with readers, but also readers with each other. Sassy even had its readers conceptualize an issue in 1990 — the “first-ever reader-produced issue of a consumer magazine” — the same year Andi Zeisler secured an internship at Sassy with a hand-illustrated envelope and the straightforward line, “I want to be your intern.” Six years later, she co-created her own magazine, Bitch, a cross between Sassy and Ms. It had the same sort of intimate community where, Zeisler explains, “there’s somehow a collective feeling of ownership that you don’t have with something like Bustle.”
Bustle, a digital media company for millennial women, is often cited as the counter-example to indie sites like Sassy, Bitch, and Rookie. It has more than 50 million monthly uniques (Bustle alone boasts 37 million) and is run by a man named Bryan Goldberg, who upon its 2013 launch wrote, with a straight face, “Maybe we need a destination that is powered by the young women who currently occupy the bottom floors at major publishing houses.” While Sassy had to struggle to be profitable and sustainable in an ad-based, legacy-driven industry, now corporate entities like Bustle manspread sites like Rookie into non-existence. “The one thing that has stayed the same,” says Zeisler, “is the fact that alternative presentations of media by and for girls and young women is really overlooked as a cultural force.”
Tavi Gevinson was born the year Sassy died, but Lena Dunham arrived just in time. Recalling her predecessor, she described her feminist newsletter, Lenny Letter, which launched in 2015, as “a big sister to young radical women on the Internet.” Delivered to your inbox, Lenny, backed by Hearst, mimicked the intimacy of magazines past, the ones that existed outside Twitter and the comments section. It included an advice column and interviews (the first was with Hillary Clinton) as well as personal essays touching on various sociopolitical issues. It was more activist than Sassy, more earnest than ironic, more 20-something than adolescent. It even had a Rookie alum, Laia Garcia, as its deputy editor. Lenny’s third issue launched it into mainstream consciousness when Jennifer Lawrence wrote an essay about pay disparity in Hollywood, which provoked an industry-wide conversation. Then, three years after launch and without warning, on October 19, Dunham and co-creator Jenni Konner published a final letter claiming “there’s no one reason for our closure,” and the newsletter shut down.
Lenny’s demise came nine months after that of another site with a loyal female-driven community: The Hairpin. Founded in 2010 by Edith Zimmerman under The Awl umbrella, the site, which had also published writing by Lenny editor-at-large Doreen St. Félix, claimed “a natural end” — the same words The Awl used for its closure. NPR’s Glen Weldon suggested more specific reasons for their termination: the decline in ad revenue online, the sites’ unwillingness to compromise, their independence. “The Awl and The Hairpin were breeding grounds for new writers — like The National Lampoon in the ‘70s, Spy Magazine in the ‘80s, Sassy in the ‘90s and McSweeney’s in the aughts,” he explained, adding, “Invariably they would find, waiting for them, a comparatively small, but loyal, sympathetic and (mostly) supportive readership.”
Two years before this, a similar site, The Toast, founded by former Hairpinners Nicole Cliffe and Daniel Ortberg, also closed. The publication was created in 2013 to be an intersectional space for women to write basically whatever they fancied. They even invited Rookie to contribute. The Toast published multiple features a day, stating, “we think there’s value in posting things that we’ve invested time and energy on, even if it comes at the expense of ‘You won’t believe this story about the thing you saw on Twitter and have already believed’ link roundups.” In a lengthy message posted in May 2016, Ortberg broke down the financial circumstances that left them weighing their options. “Most of them would have necessitated turning The Toast into something we didn’t like, or continuing to work ourselves into the ground forever,” Ortberg wrote, adding, “The only regret I have is that Bustle will outlive us and I will never be able to icily reject a million-dollar check from Bryan Goldberg, but that’s pretty much it.”
It says everything about the American media industry that Bustle, a site with an owner who mansplained women’s sites to women, a site which acquired the social justice-oriented publication Mic only after it had laid off almost its entire staff, has outlived the ones that are actually powered by women. If you look closely, you will see that the majority of women’s sites that continue to exist — from SheKnows to Refinery29 — have men in charge. Even HelloGiggles, which was created by three women, is owned by the male-run Meredith Corporation. That means that, fundamentally, these publications are in the hands of a gender that does not historically believe in the inherent value of women’s media. Women, including young women, are valuable as consumers, but if their interests cannot be monetized, they are worthless. Yet the same year The Toast closed, Lauren Duca wrote a Sassy-style essay, “Donald Trump Is Gaslighting America,” in Teen Vogue which dominated the news and garnered 1.4 million unique visitors. “Teen girls are so much smarter than anyone gives them credit for,” Phillip Picardi, Teen Vogue’s digital editorial director, reminded us. “We’ve seen an immense resonance of political coverage with our audience.” Seventeen and ELLE have also capitalized on wokeness, their spon-con sharing real estate with social justice reporting, blurring the boundaries between protesting and shopping. “The inner workings of those places are not about feminism,” says Zeisler. “They’re about selling feminism and empowerment as a brand and that’s very different from what you would find at Rookie or at The Toast or The Hairpin.”
It seems fitting that a new print teen magazine called Teen Boss launched last year. Noting that it carried no ads, Jia Tolentino side-eyed in The New Yorker, “unless, of course, it’s all advertising — sponsored content promoting ‘Shark Tank’ and JoJo Siwa (both appear in each of the first three issues) and also the monetizable self.”
Teen girls are the “giant piggybank of capitalism,” says Zeisler, and it’s an apt metaphor. Their value is their purchasing power and they are sacrificed, smashed to pieces, to get to it. When Ariana Grande obliterates every sales record known to man, man still asks why she is on the cover of BuzzFeed. Man never seems to ask, however, why sports — literal games — are on the cover of anything. This is the world in which Rookie and Lenny Letter and The Hairpin and The Toast attempt to survive, in which all that is left when they don’t are floating communities of women, because the industry refuses to make room. As Gevinson wrote, “that next iteration of what Rookie stands for — the Rookie spirit, if you will — is already living on in you.” As Dunham wrote, “Lenny IS you: every politician, every journalist, every activist, every illustrator, every athlete who shared her words here.” As The Hairpin wrote, “We hope when you look back on what we did here together it makes you proud and not a little delighted.” As Cliffe and Ortberg wrote, “The Toast was never just a chance for people to tune in to The Mallory and Nicole Show, it was also a true community and it will be missed.”
These publications did not die by their own hand. Zeisler notes that to this day, she sees people tweeting about missing The Toast. These sites died because their inherent value did not translate into monetary value in a capitalist system run by men who only know how to monetize women by selling them out. As bright and as hungry as young women are today, they are entering a world designed to shut them down. And the future looks bleak. “If media as an industry doesn’t figure out how to value [independent sites for young women] in a way that really reflects and respects the work that goes into them,” says Zeisler, “we’re just going to have a million fucking Bustles.”
* * *
Soraya Roberts is a culture columnist at Longreads.
Katy Kelleher | Longreads | December 2018 | 14 minutes (3,822 words)
In The Ugly History of Beautiful Things, Katy Kelleher shines a light on the dark underbellies of the things we adorn ourselves with. Previously: the grisly side of perfume.
* * *
In 2013, PETA released a video that changed the fashion industry. The footage, which is still available on YouTube, showed a man sitting on a bench, straddling a white rabbit that had been stretched out lengthwise and strapped down. It’s an angora, a rabbit breed prized for its long, thick, hollow-haired coat. The man begins to grab fistfuls of the rabbit’s soft fur and pulls it quickly, jerkily, tearing it from the rabbit’s flesh. As the video continues, you see more clips of rabbits being stripped naked to their pink skin. They look flayed and raw, and they cry out in pain. When I watched the video, the animal bleats disturbed my two dogs, who began running in circles, sniffing the air and wondering. I’m not sure if they were inspired to hunt, or if they could just smell my distress.
“They were the screams heard round the world,” proclaimed the animal rights organization’s website. The copy accompanying the video is triumphant, notwithstanding the stomach-churning nature of the clip: “When PETA Asia released its shocking eyewitness video footage showing that workers violently rip the fur out of angora rabbits’ writhing bodies, customers shared the video widely, vowed never to wear angora again.” After this PR disaster, retailers began pledging publicly to stop using angora wool in their products. International clothing giants like H&M, ASOS, and Gap, Inc. informed customers that they would no longer offer angora products, while unsurprisingly remaining silent on their use of exploitative labor practices to produce their disposable fashion. The pain of sweet, fluffy bunnies was a bridge too far.
I’m glad corporations are being pressured to reexamine their policies around animal products. It is disturbing to witness animal suffering, and the rabbits’ squished and feral faces, their bright-white fur, their long ears, their pink mouths — all these characteristics make it somehow worse. It doesn’t help that I had a collection of stuffed rabbits as a child; I liked to sleep surrounded by a ring of watchful plastic eyes and alert velvety ears. Like most children, I was a proto-animist, and in my primitive system of worship rabbits reigned supreme.
And yet: I own an angora sweater, made from real rabbit hair fibers. It is silky soft, and when I wear it, the appearance of my torso is elevated by the halo effect (called a “bloom”) created by thousands of tiny fibers poking through the tight weave. It makes me look a bit fuzzy and faded, like a ’60s movie star seen through a Vaseline lens. It is so soft, so light, so beautiful. I didn’t know when I bought it that angora wool came from mistreated rabbits. But I could have guessed. Most lovely things have a higher moral price tag than we like to admit.
* * *
The use of wool in clothing may date as far back as 7000 BCE. For much of that history, fabrics and knits were made from fibers harvested from sheep or goats. In 1993, archeologists found a piece of linen cloth from a site in Cayonu, Turkey. “It is not certain when people first began to weave animal fibers,” wrote John Noble Wilford for the New York Times. “It is likely that wool would have been used for weaving almost as early as flax was, but wool decays more readily than linen and so is not preserved in early archeological sites.” We know that humans had domesticated sheep and goats by this time, and it is believed that our distant ancestors were herding them for food. It is possible, and perhaps likely, that early humans were creating woven textiles from animal products some 7,000 years before Jesus Christ walked the earth.
Wool is a very sensible material, and not a very sexy one. It is naturally insulating, water-repellant, and durable. Rabbit hair sounds far more exotic than wool, and its function is slightly more decorative than sheep’s fleece. But “wool” is a bit of an umbrella term. Sometimes it refers to rabbit hair, sometimes it refers to lamb’s wool (sheared from the first coat of a newborn), and sometimes it refers to fleece from a goat or an alpaca. Sheep’s wool is the most common type, and even then it’s often broken down by provenance. No matter what animal it comes from, one of the most important ways of gauging wool’s worth is by measuring the diameter of the individual fibers. A Shetland sheep has hair that is 23 microns thick, on average. Goat fiber under 19 microns thick is considered “cashmere” (sometimes this comes from Cashmere goats, but not always). Rabbit hair is even finer, coming in at around 11 microns.
Aside from its minuscule size, rabbit hair has other textural benefits. The fibers that come from angora rabbits are long, silky, and hollow. The scales on their surface form an interlocking chevron pattern, which makes them both harder to work with (less friction to grip other fibers) and more desirable for certain garments (the aforementioned halo effect, made when the fibers slip from their weave). Most importantly, angora feels different from wool. Anyone who has purchased an Icelandic wool sweater knows that, while warm and cozy and oh-so-hygge, thick-knit wool sweaters are itchy against naked skin and smelly when wet. Angora sweaters are fluffy and lightweight. A lobsterman pulls on a thick sheep’s wool sweater; a Hollywood ingénue dons an angora knit.
While weaving wool dates back to early civilization, sweaters didn’t begin to show up on the torso-cladding scene until the 15th century. The earliest knitted wool shirts came from the British islands of Jersey and Guernsey. The sweater as we know it was most likely invented by an anonymous fisherman’s wife, seeking to keep her breadwinner alive as he braved the freezing waters of the English Channel day in and day out, and for centuries it was most closely associated with workingmen and soldiers. Women, particularly high-class, fashionable women, did not wear sweaters. While there are examples of creatively patterned and aesthetically pleasing sweaters from before the Industrial Revolution, these pieces were attractive in the same way that folk art is beautiful: They look cool today, but weren’t considered chic or classy by the tastemakers of the day.
The sweater as a fashion item was Coco Chanel’s creation. The French designer famously MacGyvered the first modern women’s cardigan prototype out of a men’s crew-neck sweater. The neck hole was too tight to pull comfortably over her head, so Chanel took a pair of scissors and cut it down the front. She added ribbons to hide the raw edges of the wool, and began wearing it out and about. People went crazy for the new style, and soon everyone was copying Chanel.
The history of angora in fashion is inextricably linked to the history of the sweater. Angora sweaters became popular in the 1920s, more than 200 years after European sailors first brought angora rabbits from Turkey, where the breed originates, to France, where they were raised as livestock and kept as pets. While many kept rabbits for their meat and fur, angora rabbits were also popular companions for 18th century aristocracy. Legend has it that Marie Antoinette kept a fluff-themed menagerie, and various blogs have proclaimed her fondness for Maine Coon cats, Bichons, and white rabbits. (Historians have only been able to document the existence of several Papillons, so the rest may stem from Sofia Coppola’s 2006 pastel-washed movie.) For the most part, angora rabbits in Europe and America were slaughtered for their pelts rather than sheared for their fibers, but that changed around the turn of the 20th century, when sweaters became “a fashion item for women” in a way that they never had been before, according to fashion historian Jonathan Walford. In an email, he wrote:
As women became more active in sporting activities—hiking, cycling, swimming, even hockey—the sports sweater became a favorite, and quickly moved into fashion, most often as a cardigan. The Great War promoted the art of knitting as a way for civilian women to do their part by making soldiers and sailors mittens, scarves, sweaters, and balaclavas.
Furthermore, the 1920s saw a shift in women’s knitwear toward lightweight, clingy styles designed to accentuate curves, a trend that Walford says came in response to the “otherwise shapeless silhouette” of the era. The flapper dress hung loose over breasts and thighs, obscuring the waist and turning the body into a column of fabric. A well-chosen sweater could combat this. Sweaters looked more fresh and modern than nipped-waist dresses or corsets, and aligned neatly with the androgynous appeal of the flapper look.
By the 1930s and 1940s, angora was more popular than it had ever been before. It was recognized for its silky beauty and its utility, and prized for its thermal qualities and its tactile appeal. The fiber was particularly popular with two influential groups of the 20th century: Hollywood starlets and Nazi officers.
* * *
The term “sweater girl” described a particular type of Lolita-esque sexpot. The sweater girl was a study in contradictions — or the epitome of the Madonna/whore dichotomy — who was simultaneously big-breasted and womanly, and innocent and childlike. Hollywood publicists first coined the phrase to describe Lana Turner, who played a sweater-wearing teenage murder victim in the 1937 film They Won’t Forget. In the movie, 16-year-old Turner is bombshell beautiful, and her tight sweaters (paired with equally tight pencil skirts) accentuate her hourglass waist and prominent breasts. In Life magazine, screenwriter Niven Busch wrote that Turner “didn’t have to act” much, for her scene “consisted mostly of 75-ft. dolly shot of her as she hurried along a crowded street in a small Southern town. … She just walked along wearing a tight-fitting sweater. There was nothing prurient about the shot but the male U.S. found it more stimulating than a year’s quota of chorus girls dancing in wampum loin cloths.”
This was also an era when “breast fetishism” was on the rise. Women had begun wearing pointy “bullet bras” that exaggerated their shapes, turning naturally pillowy and pliable breasts into hard conical hills. A sweater paired with a bullet bra was the perfect combination of hard and soft, innocent and sexy, curvy and contained. Even though Turner was underage, it seemed permissible to lust after her, for she embodied a certain wholesome sex appeal that spoke to mid-century American audiences. “Maybe [Turner] didn’t look like the average high-school girl,” wrote Busch, “but she looked like what the average high-school boy wished the average high-school girl looked like.” Turner’s slightly risqué look resonated with women as well as men. There was a simplicity to this fashion — it was easy to replicate the sweater girl look. It was accessible and utterly American. (Busch also notes that the only person “profoundly shocked” by the audience reaction to her body was Turner herself, who began to “bitterly oppose” her sweater girl name, and for the years following her debut film, the starlet refused to wear tight-fitting knits on camera.) Following Turner’s splash as a glamorous dead girl, starlets like Jayne Mansfield and Jane Russell began adopting the style, and by the 1940s and 1950s, the sweater girl was one of the more persistent tropes in American media. Walford notes that director and artist Ed Wood “always” wore angora as part of his drag. “Fit would be part of the reason,” Walford says, “because they would fit his male form better than women’s blouses, but touch was also at play. Angora has a sensual touch, like silk, camel hair, leather or rubber — all materials that have fetishistic followers.”
While wide-eyed actresses in Hollywood were squeezing their torsos into fuzzy tops, soldiers in Germany had begun a focused series of experiments designed to test the long-term viability of raising angora rabbits for their hollow hairs. Angora appealed to the Nazis for several reasons. First, it had a sense of glamor to it — the fabric was associated with luxurious evening wear, and the Nazis were acutely aware of the importance of presentation and fashion (hence the continued fascination with “Nazi chic”). Second, angora was ideal for lining pilots’ jackets, since it was thin, water-repellant, warm, and unlikely to cause itching in the cold cockpit. They also planned to use it for sweaters, socks, and underwear — all garments that would lie close to the body and keep soldiers warm and dry while they were trekking across the Ukrainian steppe to wage war on the Eastern Front. In 1943, SS officers created a photo album to document the work they were doing at Dachau. The volume contains approximately 150 mounted photographs, maps, charts, and hand-lettered texts. There are pictures of rabbit hutches (which Stassa Edwards at Atlas Obscura calls “sanitary, modern”), descriptions of their feeding schedule, and instructions for feeding, shearing, and grooming rabbits. This album was “some of the last remaining evidence of Project Angora,” Edwards writes, “an obscure program begun by Himmler for the purpose of producing enough angora wool to make warm clothes for several branches of the German military.”
By 1943, Project Angora had been underway for two years, and workers had bred nearly 65,000 rabbits and created more than 10,000 pounds of wool. Few examples of these military textiles survive. But Project Angora isn’t notable for its material output or its influence on clothing or fashion, but rather for the cleanliness of its wards, the purported humanity of it all. The rabbits housed at German concentration camps were kept in large hutches. They were fed well and petted routinely. SS officers bonded with the animals. Sigrid Schultz, the reporter who uncovered the notorious photo album in 1945, described the cruel irony of the project:
In the same compound where 800 human beings would be packed into barracks that were barely adequate for 200, the rabbits lived in luxury in their own elegant hutches. In Buchenwald, where tens of thousands of human beings were starved to death, rabbits enjoyed scientifically prepared meals. The SS men who whipped, tortured, and killed prisoners saw to it that the rabbits enjoyed loving care.
The Nazis didn’t see humans as equivalent to rabbits or rats or other mammalian creatures — they had sympathy for animals and valued their welfare. That was part of their mythology; it was important to Himmler that the German people viewed the Nazis as progressive when it came to animal rights. “The thesis that viewing others as objects or animals enables our very worst conduct would seem to explain a great deal,” wrote Paul Bloom in the New Yorker. “Yet there’s reason to think that it’s almost the opposite of the truth.” According to Bloom, the focus on shame and humiliation reveals that Nazis (and other racist groups) don’t use the language of the zoo to excuse their actions or annul their guilt. They don’t imagine people as animals so that they can hurt them more easily. Rather, their tortures are explicitly designed to highlight their humanity. “The sadism of treating human beings like vermin lies precisely in the recognition that they are not,” Bloom argues.
The very same Nazis who were torturing and brutalizing the Jewish people in the camps were also posing with rabbits, brushing them, and snuggling them. They were capable of offering mercy to living creatures, and they were equally capable of acting out their sadistic fantasies on other people. At Project Angora, sadism lived next-door to tenderness, and I can’t think of anything uglier than that.
* * *
On a rainy Sunday in July, I visited the Kerfluffle Fiber Farm in Lebanon, Maine, which raises alpacas, sheep, and angora rabbits for their wool. I walked among the rabbit hutches and held a Satin angora rabbit named Sweetie Pie and felt her small heart beat against my fingertips. Unlike the farms in the PETA videos, at Kerfluffle, the rabbits are not squished into cages to tremble and squeal and wait for their next brutal shearing. Yes, they live in cages, they tremble, and they are (sometimes) sheared. But though the same words can be used to describe their basic conditions, the substance is completely different. The family farm is sprawling and green, with children’s toys strewn about the lawn. The rabbit cages are housed in an old horse stall in the wooden barn. Each rabbit has enough space to move around — they can hop and play and defecate and feed without contaminating their food or making a mess of their space. The rabbits are clean and well-groomed. I don’t see any oozing sores or open wounds and the hair is never ripped from their bodies, but harvested through brushing. I hear no screams, only the sounds of geese cackling and goats bleating. As I stroke my hands down the back of the angora, I can feel how easily this fur could be removed. There is no need to yank — it comes out naturally, long white fibers sticking to my sweaty palms before blowing away on the humid summer wind like dandelion seeds.
Mandy McDonald, certified fiber sorter and owner of Kerfluffle Farm, began keeping rabbits years ago. She was a lifelong knitter on a continual quest to find the best yarn, eventually choosing to raise angora rabbits because they were more affordable than alpacas or sheep. But even though it’s possible for a dedicated knitter to raise enough rabbits to make a scarf, it is difficult to reproduce this type of humane animal husbandry on a large scale. “New England used to be the mecca of textile manufacturing in the early 1900s,” McDonald says. “But now we don’t have the type of economy where we could raise our own fiber and make a living off it.” It’s impossible to compete with the fibers from overseas, though McDonald does manage to sell some of her knitted wares, like baby bonnets and scarves. “They’re heirloom gifts,” she says.
“Heirloom gifts” is a sweet and marketable way to phrase it. In reality, angora fur may simply be “incompatible with industrial capitalism,” writes Tansy Hoskins for The Guardian. “In this sense it should be a scarce fabric, rather than something cheaply produced.” She notes that the Chinese angora farms like the ones documented by PETA have all but killed angora production in the U.K. Out of the 3,000 tons produced each year, 90 percent comes from China, according to the International Wool Textile Association. And while there’s growing support for animal welfare laws in China, there are still few laws protecting animal rights and no nationwide laws that explicitly prohibit mistreatment of animals.
But sales of angora wool have decreased since PETA released its disturbing video. In 2010, China exported $23 million worth of angora rabbit wool, according to the International Trade Center, and in 2015 that number was down to $4.3 million. The Business of Fashion also reports that “countries with cottage industries in angora — including the U.K., France, Italy, and Germany — have also seen exports decrease.” Italy, a major angora consumer thanks to their famous fabric mills, has seen a 77 percent decrease in angora imports.
There are many stories about brands pledging not to use rabbit fur but very little information available about how the Chinese angora industry has changed — which leads me to suspect that it hasn’t. Instead of buying pricier humane angora, retailers have simply stopped using the stuff altogether; it’s simply too expensive for cheap-chic spots like H&M and too obscure to be a true status material for higher-end brands. It’s also worth noting that China isn’t alone in their cruel treatment of these skittish creatures. In 2016, a French animal rights group went undercover at an undisclosed location in France to document similarly inhumane treatment of angora rabbits, including animals that had been exposed to extreme temperatures and plucked so indiscriminately that even their genitals were covered with painful scabs.
In order to harvest angora on a large scale and make it affordable for the average person, it seems inevitable that animals will be harmed. Raising angora the way that McDonald does would drive the prices up so high that few could afford the fabric. A set of mittens from Ambika, a New York–based independent designer whose website touts their humane treatment of rabbits and their solar-powered facilities, will set you back $260, and a cardigan-style coat costs a cool $2,175. The jacket is gorgeous, a white frothy confection made from 100 percent angora rabbit fiber, but the price tag means that this item will forever be beyond my reach. (There has never been a large angora industry in the United States, though plenty of farmers raise angora rabbits for fun or profit. People eat the meat, harvest the fur, and even breed them as show animals; the truly dedicated breeders head to Palmyra, New York, for the National Angora Show, an event the New York Times calls the “Westminster for Angoras.”)
Despite the fact that there are few economic benefits to raising rabbits, McDonald continues to raise fiber animals, including alpacas and sheep, because she loves the act of caretaking. “It makes me feel alive to nurture an animal,” she says. “And I love soft and fluffy things.” Angora is soft and silky, luscious and sensual. It’s also the product of an adorable animal, a creature that looks like an animated cloud puff. A contradiction in a sweater.
* * *
Rabbits are cute, and like most cute things, they make us want to hold them close and squeeze them, protect them from harm, bond with them. This is a visceral emotion, one that can look a little like love if you stand at a great enough distance. Even a Nazi can recognize the cuteness of an angora rabbit, stroke its wispy hair, feel its soft pink paws, and even a Nazi can think, somewhere in his monstrous mind, that this is a creature that does not deserve to suffer. This impulse can look like kindness — but it isn’t, not truly. Kindness and compassion are more complicated than protectiveness, and harder to embody. When we boycott sweaters made from abused animals yet fail to extend the same outrage to clothes made in sweatshop conditions, we’ve fallen prey to the dark side of cuteness. When we break women down into individual pieces, breasts and arms and fluffy torsos, we fail to see the whole human, the sensitive teenager behind the sexpot. Cuteness narrows our vision, making it difficult to see the greater picture. Pull a thread long enough and the entire system unravels, revealing the underground abuse woven into our wardrobes and culture.
* * *
Katy Kelleher is a freelance writer and editor based in Maine whose work has appeared in Art New England, Boston magazine, The Paris Review, The Hairpin, Eater, Jezebel, and The New York Times Magazine. She’s also the author of the book Handcrafted Maine.
Editor: Michelle Weber
Factchecker: Sam Schuyler
Copyeditor: Jacob Z. Gross
Katy Kelleher | Longreads | September 2018 | 15 minutes (3,859 words)
If given the choice to smell like whale excrement or delicate white flowers, few people would choose the first option. Bile, feces, vomit, and animal oils sound as though they would smell repulsive. The words conjure up scent memories of that time your dog released his anal glands on the duvet, or that summer you worked by the wharf and the August air was thick with the miasma of oily herring heads. Jasmine, on the other hand, sounds like a love song, a Disneyfied dream. Try, right now, to imagine the smell of blooming jasmine. Your memory, ill-equipped to locate scents in its baroque filing system, might pull up something syrupy sweet or softly floral. Is that how you want your body to smell?
Too bad: if you choose door number two, you’ll walk away reeking of sharp vegetal tones tempered by a slightly earthy, foul scent. Jasmine absolute is an oily, semi-viscid, dark amber fluid that is denser and more concentrated than jasmine essential oil. Essential oils come from distilled, boiled, or pressed plant matter, while absolutes are traditionally made through a process called enfleurage, which involves submerging the delicate blossoms or spices in fat before extracting their fragrance molecules into a tincture of ethyl alcohol. While it’s a common ingredient in a natural perfumer’s tool kit, jasmine absolute smells strange: complicated, beautiful, not entirely pleasurable. It reeks of indole (rhymes with “enroll”), an organic chemical compound also found in coal tar, human feces, and decomposing bodies.
If you choose door number one, you’ll be blessed with the kiss of ambergris, a highly desirable natural substance that smells sweet yet rather marine, like vanilla and unrefined sugar mixed with seawater. The scent reminds me a little of the smell of my dog’s paws — pink and light and animal. It smells like cashmere feels. Smelling ambergris is an innate pleasure, one that even an infant would recognize as enjoyable, like the first sip of sweet milk.
For more than a thousand years, humans have been adorning our bodies with animal products like ambergris and putrid-smelling plant derivatives like jasmine absolute. We apply off-putting materials to our bodies to enhance and mask our natural scents. Like dogs that roll in deer carcasses, humans seek to change our olfactory emissions by borrowing from other creatures. It’s not always about simply smelling good: We want to smell complex, so that others will be compelled to keep coming back, like bees to a flower, to sniff us again and again, to revel in our scents, and draw ever closer to our warm, damp parts.
According to natural perfumer Charna Ethier, ambergris can smell like “golden light” or a “flannel shirt that has been dried on a clothes line on a warm summer day.” Although there are several types of ambergris (including gray, gold, and white), Ethier is referring to her own personal sample, which she characterizes as “soft, fresh, and ozonic.” Ethier is the owner of Providence Perfume Company in Rhode Island, and inside her well-stocked cabinet of olfactory curiosities, she keeps a single bottle of the precious stuff. Next to her 100-year-old cade oil (a foul-smelling liquid made from juniper trees, purchased at an estate sale) and below her collection of floral absolutes and herbal essences, she has stashed a bit of ambergris tincture. The clear glass vial contains a mixture of ambergris and alcohol that includes just 5 percent whale matter. In its pure form, this substance is a waxy gray ball of animal secretion, a floating fat-berg that is “more expensive than gold.” Unlike jasmine absolute, which plays a role in many of her perfumes, real ambergris is simply too expensive to use in a commercial product. “It’s considered the miracle ingredient for perfumes,” she says. “It makes everything better.”
Ethier doesn’t use any synthetics in her perfume, nor does she use animal products, though animal scents are a traditional ingredient in perfumery. Not only are these compounds expensive, but true mammalian products like musk, civet, and ambergris often come at a cruel cost. Whales have been murdered for their oily blubber and concealed stomach bile, civets are caged and prodded for their fear-induced anal gland secretions, and musk is harvested from the glands of slaughtered deer. Many people know that perfumers build their trade on the graves of millions of tiny white flowers, but fewer people realize they also bottle and sell the byproducts of animal pain and suffering. Perfumers who use synthetic materials are exempt, in a sense, as are those who use found or vintage materials. Ethier’s ambergris is “quite old” and reportedly beach-found (“I hope it is,” she says). But even perfumes that use synthetic compounds or salvaged bile carry the whiff of death; the history of the industry is steeped in it, and that smell doesn’t wash out easily.
There’s a reason perfumers use these notes. They enhance the floral scents, undercutting lightness with a reminder of darkness. Animal products are the antiheroes in this drama — even when you hate them, you still, just a little, love them. That’s how siren songs work, and ambergris sings the loudest. Once, Ethier made a perfume using her most prized ingredients. She mixed 100-year-old sandalwood essence with ambergris tincture and frangipani and boronia absolutes, two flowers native to Central America and Tasmania, respectively. It was the first time she’d used ambergris, and this one-off perfume was so lovely that “it was like gold-washing something.” She remembers wistfully, “It was so beautiful.”
* * *
Smell is the most underrated and mysterious sense. In her 1908 autobiography, The World I Live In, Helen Keller called scent the “fallen angel.” “For some inexplicable reason, smell does not hold the high position it deserves amongst its sisters,” she wrote. Keller mapped her world by smell — she could smell a coming storm hours before it arrived and knew when lumber had been harvested from her favorite copse of trees by the sharp scent of pine. In contrast to touch, which she called “permanent and definite,” Keller experienced odors as “fugitive” sensations. Touch guided her; scent fed her. Without smell, Keller imagined her world would be lacking “light, color, and the Protean spark. The sensuous reality which interthreads and supports all the gropings of my imagination would be shattered.”
We don’t often think in terms of color and light when it comes to smell, perhaps because we have so few words for scent that we borrow from the lexicons of our other senses. Despite the fact that smell is our most ancient sense — our so-called “lizard brain” is also sometimes termed the rhinencephalon, literally the “nose brain” — it is also one that seems to elude language. “Smell is the mute sense, the one without words,” wrote Diane Ackerman in A Natural History of the Senses. “Lacking a vocabulary, we are left tongue-tied, groping for words in a sea of inarticulate pleasures and exaltation.” We’ve had eons to come up with words for the precise smell of fresh-turned earth or the exact scent of a blazing beach fire, and still the best we can do is earthy and smoky.
Perfumers have their own language, but their words have only recently begun to trickle down into popular culture through beauty magazines and blogs. Not only do perfumers and their superfans speak of absolutes, oils, and tinctures, but they can also rattle off compounds like coumarin and eugenol. A trained master perfumer (or “nose”) can pick out precise scents within a layered perfume. They don’t just call something foul — they can pick out the pungency of musk or the reek of tobacco, ingredients that are delicious in small doses but overwhelming when used out of balance.
In my quest to understand the appeal of seemingly repugnant ingredients, I spoke with doctors who study the nose, perfumers who feed the organ, and even a zookeeper who spends her days breathing in the pure, undiluted scent of civet discharge. While they had various theories as to why darkness seems to be an essential element of beauty, they all agreed on one thing: It’s all about context. In the right context, even the smell of death can be appealing. In the right context, vomit can be more desirable than gold. In the right context, with the right music playing in the background, you begin to root for the glamorous hit woman or the sardonic drug dealer.
They also agreed that sex is part of this equation, and it’s the easiest explanation to trot out. But perfumery is also about more than just smelling nice and attracting a mate. It’s about aesthetics, taste, and desire in a more general sense. We want to smell intoxicating, and truly intoxicating things are often a little bit nasty — they have an edge that cuts deeper than simple sensory pleasure. And despite how it may seem, encounters with the beautiful are rarely entirely enjoyable. If that were the case, Thomas Kinkade’s light-dappled cottages would be considered the height of fine art, and we would all walk around misted lightly with synthetic jasmine and fake orange blossom. Instead, we adore the luscious gore of Caravaggio’s canvases and dab our pulse points with concoctions containing the miasma of swamp rot, the cloying smell of feces, and the pungent, tonsil-kicking fetor of death. Beauty is sharp, it is intense, and it comes at a cost. Just as desire and repulsion walk through the same corridors of our minds, so too do beauty and destruction move hand in hand. Whenever you find something unbearably beautiful, look closer and you’ll see the familiar shadow of decay.
* * *
One of the first known perfumers in history was a woman named Tapputi-Belatekallim. According to clay cuneiform tablets dating back to 1200 BCE, Tapputi lived in ancient Babylon and likely worked for a king. The second part of her name, “Belatekallim,” indicates that she was head of her own household, in addition to holding a valued position at court. Thousands of years before the advent of the “SheEO,” Tapputi was leaning in and bossing around underlings. She was a master of her craft, and recognized as such by her peers. Much of what we know about her comes from secondary sources, but the process of distilling and refining ingredients to produce a fragrant balm — oil, flowers, water, and calamus, a reed-like plant similar to lemongrass — is described on surviving clay tablets. It’s miraculous how modern her scents seem — or rather, it’s surprising how little has changed. Tapputi used scent-extracting techniques like distillation, cold enfleurage, and tincture that natural perfumers still use today. She also mixed grain alcohol with her scents, creating perfumes that were brighter, lighter, and had more staying power than anything else available at the time. These scents may have played a religious role in ancient culture, but they may have simply been another way to prettify the body and please the senses.
Unfortunately, Tapputi’s story is a fragmented one — she’s possibly the first female chemist, and yet she’s been lost to history. There is much more evidence available about the perfumes of ancient Egypt, Persia, and Rome. In 2003, archaeologists unearthed the world’s oldest known perfume factory in Cyprus. They theorize that this mud-brick building and the perfumes it produced caused Greek worshippers to begin associating the island with Aphrodite, the goddess of sex and love. (Born from the magical remnants of the sky god’s testicles, which had been separated from his body and cast into the sea by Cronos, the Titan god of the harvest, Aphrodite supposedly walked from the foaming waters of the sea and onto the beach at Paphos, an ancient settlement located on the southern coast of the island.) Analysis of the material found on-site revealed that these ancient perfumers were using plant-based ingredients like pine, coriander, bergamot, almond, and parsley, among others.
These perfumes all sound rather pleasant, don’t they? I can imagine dabbing almond oil mixed with a bit of bergamot on my wrists, catching a botanical draft of scent here and there as I move. It seems terribly obvious that people may want to smell like plants. Some of the earliest pieces of art represent flowers, leaves, and trees. Studies have shown that we crave symmetry on an unconscious level, and we’re drawn to color, so it makes perfect sense that flowers would hold our attention with their Fibonacci spirals and vivid hues. I can even understand why curiosity might compel someone walking along a beach to pick up a chunk of marine fat and sniff it. It’s a bit harder to understand the moment when medieval perfumers made the conceptual leap from smelling the glandular sacs of dead musk deer to dabbing it on their pulse points. Yet at some point, this must have happened, for starting after the Crusades, Europeans became obsessed with musk.
Like many prized spices, fabrics, and luxury items, musk came to Europe from the Far East. Derived from the Sanskrit word for testicle, “musk” refers to the glandular products of small male Asian deer. These little sacs of animal juice were harvested from the bodies of slain deer and left to dry in the sun. In its raw form, musk smells like urine, pungent and sharp. But after being left to dry, musk develops a softer scent. The reek of ammonia fades, and it becomes mellow and leathery. It stops smelling like piss and begins to smell like fresh sweat, or the downy crown of a baby’s head. It gained a reputation as an aphrodisiac; according to some legends, Cleopatra used musk oils to seduce Mark Antony into her bed. The size of musk molecules also contributes to its popularity in perfumery: Larger molecules oxidize more slowly, so musk’s comparatively large molecules outlast other odors and allow musk to extend the life of other scents. This fixative property means musk is a base note in many perfumes, even ones that don’t smell overtly musky.
In 1979, musk deer were listed as an endangered species by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES), so it’s no longer legal to use natural musk in commercial perfumes. However, Tibetan musk deer are still killed for their glands, and a brisk trade in poaching has resulted in some illegal musk showing up online. Musk is also used in some traditional Chinese and Korean remedies, which helps the substance remain one of the most valuable animal products on earth. In his book The Fly in the Ointment, Joe Schwarcz, director of the McGill University Office for Science and Society, points out that musk is “more valuable than gold.”
Civet is a lesser-known fragrance, though it also appears frequently in perfumes. Made from the glands of a mammal that shares the name of the scent, civet is similar in structure to musk on a molecular level but smells even more animalistic, according to people who have actually sniffed it. “They have a general odor about them that is very pungent,” says Jacqueline Menish, curator of behavioral husbandry at the Nashville Zoo. Civets are uncommon zoo creatures. They are neither felines nor rodents, though they’re commonly mistaken for both. Although few visit the zoo just to glimpse these odd little nocturnal creatures, the Nashville Zoo has several banded palm civets because the zoo director “just loves them.” (You may have heard of civet coffee, a product made by force-feeding Asian palm civets coffee beans, then harvesting the beans from their poop. Society, it seems, has come up with several odd ways to make money from civet asses.) When they are startled, frightened, or excited, civets “express” their anal glands, and the greasy liquid “shoots right out.” The scent hangs in the air for days. “I guess I could see if it was diluted it might not smell as offensive,” Menish concedes. “But it can be really bad if it hits you.”
Unlike musk, civet can be collected without killing the animal, but it’s not a cruelty-free process. Civets are kept in tiny cages and poked with sticks or frightened with loud noises until they react and spray out their valuable secretions. Commercial perfumers no longer use genuine civet in their fragrances, but James Peterson, a perfumer based in Brooklyn, owns a very small vial of civet tincture. “It smells terrible when you first smell it,” he says. “But I have some that is five years old, and it gets this fruity quality as it ages. In a tincture, it gets this rich scent that works wonderful with florals.” On a few occasions, Peterson has used genuine musk or civet to make “tiny amounts” of specialty perfumes, and the resulting blends have an “intensely erotic draw.” Customers report that these dark and dirty smells are potent aphrodisiacs. “When it’s below the level of consciousness, that’s when it works best,” he adds.
Like musk and civet, ambergris comes from an animal, but making it doesn’t necessarily involve murdering whales. Whales have historically been killed for their bodily products, including their oil, spermaceti, and their stomach contents, but it’s more likely now that ambergris is beach-found since it is only produced by an endangered species, sperm whales. The waxy substance forms in the hindgut of a sperm whale to protect their soft interiors from hard, spiky squid beaks. According to Christopher Kemp, author of Floating Gold: A Natural (and Unnatural) History of Ambergris, ambergris begins as a mass of claw-shaped horns that irritate the whale’s digestive systems. As the mass gets pushed through the whale’s hindgut, it grows and slowly becomes “a tangled indigestible solid, saturated with feces, which begins to obstruct the rectum.” Once it passes into the ocean, it begins to slowly mellow out. The black, tar-like wad is bleached by the ocean until it becomes smooth, pale, and fragrant. It ranges in color from butter to charcoal. The most valuable ambergris is white, then silver, and finally moon-gray and waxy. It’s believed that only 1 percent of the world’s sperm whale population produces ambergris. It’s very rare, very bizarre, and very valuable.
The human appetite for ambergris dates back to ancient times. The Chinese believed it was dragon spit that had fallen into the ocean and hardened, and the ancient Greeks liked to add powdered ambergris to drinks for an extra kick. King Charles II of England liked to eat ambergris with eggs, which was apparently a fairly common practice among the aristocracy in England and the Netherlands. It shouldn’t be surprising that people engaged in some light coprophagia — smell and taste are so deeply linked, and while I can’t attest to the taste of ambergris, I can say that it smells beguiling. Given the chance, I would sprinkle some silvery whale powder on my eggs, just to see what it was like. (It’s certainly no stranger than eating gold-coated chicken wings — another practice seemingly designed to destroy value by passing the desired object through a series of rectums until it reaches the inevitable white bowl.)
In perfume, ambergris is often used to boost other scents. It plays a supporting role rather than a starring one, for although the smell is fascinating, it isn’t very strong. It has an unearthly fragrance. It smells like the sea, but also like sweet grasses and fresh rain. It’s amazing that something made in the bowels of the whale could smell so pure. If you found fresh ambergris, midnight black and sticky and stinking, perhaps you wouldn’t want to eat it. But with distance and dilution, ambergris is transformed from animal garbage to human ambrosia.
* * *
Schwarcz’s book offers one reason why we’re drawn to these scents, citing studies that suggest people with ovaries may be more sensitive to musk, particularly around ovulation. He cautiously speculates that musk might resemble chemicals produced in humans to attract potential mates.
Over the phone, he is even more wary of speculating about a possible evolutionary explanation for our fragrance preferences. “The sense of smell has been studied thoroughly with surprisingly little results in terms of what we actually know. It’s such a complicated business,” he said. “We don’t know why musk is more attractive to some people than others. We don’t know why it smells differently when it’s diluted, but we know that it does.” When I asked whether we like musk because we’re programmed to enjoy the smells of bodies, he was quick to turn our talk toward the issue of pheromones, which “may not actually even exist at all” in humans, despite our desire to attribute various observed phenomena to the invisible messengers. According to Schwarcz, much of what the general population knows about pheromones only applies to certain nonhuman species. For instance, boar pheromones are well understood, easy to replicate, and used by farmers to increase the farrowing rate amongst their stock. Some of the perfumes that boast “real pheromones,” like Jovan Musk and Paris Hilton’s eponymous scent, may contain pheromone molecules — ones that pigs would find very enticing.
But where science fails to offer a satisfactory explanation, artists can step in, providing an illuminating tool to help us understand our relationship to desire and aesthetics. For perfumer Anne McClain, co-owner of MCMC Fragrances in Brooklyn, it is the tension between foul and sweet that elevates a fragrance from consumer product into the realm of art. This is key when it comes to repugnant ingredients, from indolic florals to musky secretions. The indecent element becomes a secret of sorts, a gruesome piece of marginalia scribbled alongside the recipe, visible only to those in the know but appreciated by all. The foulness whispers below the prettiness, and combined, these various elements create a scent that smells paradoxically clean and dirty, light and dark.
“Indole is what makes the scent of jasmine interesting,” she says. “It makes you want to come back and smell it again — it has an addictive quality to it.” Unlike citrus scents, which are one-note and rather simplistic, florals have an element of decay, a whiff of putridity. McClain rightfully points out that this is part of what makes flowers themselves attractive to bees and other pollinators. Corpse flowers famously smell like dead bodies, but so do many other blossoms, just to a lesser extent.
Plus, humans are by nature “just a little bit gross,” McClain says. Like civets, musk deer, and whales, we shit, we secrete, we mate, and sometimes we vomit. But we also give birth and create beauty, and for McClain, it’s this life-giving ability that links blossoms and humans. “I think there is a depth to anything that is made of life and creates life. There’s something inherently sexual in that,” she says. “Even though something like civet will smell gross on its own, it adds an element of reality.” When layered properly with other olfactory delights, this can create an evocative smell, one that you want to return to, to interrogate with your nostrils the same way you might pore over a painting. Through layering pleasure on top of disgust, perfumers can create something that resembles life — exquisite, fleeting, and mysterious.
* * *
Editor: Michelle Weber
Factchecker: Matt Giles
Copyeditor: Jacob Z. Gross