Category Archives: Nonfiction

The Dog Breeds Disappearing in India

Chippiparai dog on the streets of Tirupudaimarudur, via Wikimedia

I wish I’d paid closer attention to breeds when I adopted Harley, a chihuahua mixed with something barky. I didn’t know that chihuahuas can be monogamous to a fault, making it hard on him when I’m not around. And I should have known better about the terrier influence. No matter how I try, he’s still shouty at everything that walks by the back gate.

Too late, too late, he’s very much my dog now, 12 pounds of shouty possessive loyalty on four legs. If I’d gone by breed characteristics alone, I’d have a very different dog sleeping next to the kitchen heater vent right now.

Doodles are so cheerful, retrievers are easygoing, Australian shepherds are smart and good workers. No wonder they’re popular. But the faddishness of dog fancy has led to the decline of less common breeds. Scroll.in introduces us to indigenous Indian dog breeds that have lost favor in their home country as Labs and shepherds become the companions of choice.

In ancient times, Indian dogs were prized across the world and exported in large numbers for hunting prowess – travelling as far and wide as Rome, Egypt and Babylon – but they were shunned at home. The international demand for Indian dogs and the fact that their gene pool stayed relatively undiluted till about three centuries ago kept these breeds going. Baskaran tells us that in the 18th century, a Frenchman travelling across India identified 50 distinct dog breeds including one called the Lut, which was often a fascinating shade of blue.

The Lut is just one of several dog breeds that have not been seen in living memory, writes Baskaran. Based on four decades of research and observation, the author concludes that there are just 25 indigenous Indian dog breeds found today. The reasons for this decline are vast and complex. During the colonial period, British rulers settling into India for the long haul often imported dogs from back home. The arrival of foreign breeds resulted in cross-breeding, and there was little government interest in preserving indigenous breeds and trying to keep their gene pool intact. The few Indian rajas who did have dogs as pets were more drawn to foreign breeds. The only attempts to protect Indian breeds were made by British dog enthusiasts, who had taken a particular fancy to our indigenous dogs, especially those found in the Himalayas.

They’re all good dogs.

Read the story

When to (Not) Have Kids

An employee of Planned Parenthood holds a sign about birth control to be displayed on New York City buses, 1967. (H. William Tetlow/Fox Photos/Getty Images)

For a variety of reasons, I don’t have kids. As a woman of a certain age, I’ve been conditioned to believe I must qualify that statement by assuring you it’s not that I’m some kid hater, or that I don’t think babies are cute. They are! (Okay, I also find them to be kind of disgusting.) But among my many reasons for not procreating is that kids grow up to be people, and life for most people on this overcrowded, overheated planet is hard, and getting harder.

Even before Donald Trump took office, I had often wondered: with terrorism, war, and genocide, with climate change rendering Earth increasingly less habitable, how do people feel optimistic enough about the future to bring new people into the world? Since the presidential election, the prospects for humanity seem only more dire. I’m hardly alone in this thinking; I can’t count how many times over the past year I’ve huddled among other non-breeders, wondering along with them in hushed tones, How on earth do people still want to have kids? I was surprised, at this bleak moment in American history, that I hadn’t seen any recent writing on the topic. Was it still too taboo to discuss not making babies, from any angle? Then this past week a few pieces caught my eye.

The one that spoke most directly to my doubts about perpetuating the human race, and its suffering, was “The Case for Not Being Born,” by Joshua Rothman at The New Yorker. Rothman interviews anti-natalist philosopher David Benatar, author of 2006’s Better Never to Have Been: The Harm of Coming into Existence, and more recently, The Human Predicament: A Candid Guide to Life’s Biggest Questions. Rothman notes that Benatar makes no bones about his pessimism as it relates to humanity.

People, in short, say that life is good. Benatar believes that they are mistaken. “The quality of human life is, contrary to what many people think, actually quite appalling,” he writes, in “The Human Predicament.” He provides an escalating list of woes, designed to prove that even the lives of happy people are worse than they think. We’re almost always hungry or thirsty, he writes; when we’re not, we must go to the bathroom. We often experience “thermal discomfort”—we are too hot or too cold—or are tired and unable to nap. We suffer from itches, allergies, and colds, menstrual pains or hot flashes. Life is a procession of “frustrations and irritations”—waiting in traffic, standing in line, filling out forms. Forced to work, we often find our jobs exhausting; even “those who enjoy their work may have professional aspirations that remain unfulfilled.” Many lonely people remain single, while those who marry fight and divorce. “People want to be, look, and feel younger, and yet they age relentlessly. They have high hopes for their children and these are often thwarted when, for example, the children prove to be a disappointment in some way or other. When those close to us suffer, we suffer at the sight of it. When they die, we are bereft.”

While this isn’t how I always look at life, I believe Benatar makes some good points. (Not to mention I’ve endured three of the above-mentioned hot flashes while writing this, and one’s optimism does tend to dip in those estrogen-depleted moments.)

Rothman’s piece reminded me of an essay we published here on Longreads a couple of years ago, “The Answer is Never,” by Sabine Heinlein. Like me, Heinlein often finds herself having to defend her choice to be childless: “One of the many differences between my husband and me is that he has never been forced to justify why he doesn’t want to have children. I, on the other hand, had to prepare my reasons from an early age.” She keeps a laundry list of reasons handy:

Over the years I tried out various, indisputable explanations: The world is bursting at the seams and there is little hope for the environment. According to the World Wildlife Fund, the Earth has lost half of its fauna in the last 40 years alone. The atmosphere is heating up due to greenhouse gases, and we are running out of resources at an alarming speed. Considering these facts, you don’t need an excuse not to have children, you need an excuse to have children! When I mention these statistics to people, they just nod. It’s as if their urge to procreate overrides their knowledge.

Is there any knowledge forbidding enough that it could potentially override such a primordial urge? In a devastating essay at New York magazine, “Every Parent Wants to Protect Their Child. I Never Got the Chance,” Jen Gann attests that there is. Gann writes about raising a son who suffers from cystic fibrosis, an incurable disease that will likely lead to his early death. The midwife practice neglected to warn her that she and her husband were carriers, and Gann writes that she would have chosen to terminate the pregnancy if they had.

The summer after Dudley was born, my sister-in-law came to visit; we were talking in the kitchen while he slept in the other room. “But,” she said, trying to figure out what it would mean to sue over a disease that can’t be prevented or fixed, “if you had known — ” I interrupted her, wanting to rush ahead but promptly bursting into tears when I said it: “There would be no Dudley.” I remember the look that crossed her face, how she nodded slowly and said, twice, “That’s a lot.”

What does it mean to fight for someone when what you’re fighting for is a missed chance at that person’s not existing?

The more I discuss the abortion I didn’t have, the easier that part gets to say aloud: I would have ended the pregnancy. I would have terminated. I would have had an abortion. That’s firmly in the past, and it is how I would have rearranged my actions, given all the information. It’s moving a piece of furniture from one place to another before anything can go wrong, the way we got rid of our wobbly side tables once Dudley learned to walk.

Finally, an essay that took me by surprise was “To Give a Name to It,” by Navneet Alang, at Hazlitt. Alang writes about a name that lingers in his mind: Tasneen, a name he had come up with for a child when he was in a relationship years ago, before the relationship ended, childlessly. It reminded me of the names I long ago came up with for children I might have had — Max and Chloe, after my paternal grandfather and maternal grandmother — during my first marriage, long before I learned I couldn’t have kids. This was actually good news, information that allowed me, finally, to feel permitted to override my conditioning and recognize my lack of desire for children, which was a tremendous relief.

Reading Alang’s essay, I realized that although I never brought those two people into the world, I had conceived of them in my mind. And somehow, in some small way, they still live there — two amorphous representatives of a thing called possibility.

A collection of baby names is like a taxonomy of hope, a kind of catechism for future lives scattered over the horizon. Yes, those lists are about the dream of a child to come, but for so many they are about repairing some wound, retrieving what has been lost to the years. All the same, there were certain conversations I could have with friends or the love of my life, and certain ones with family, and somehow they never quite met in the same way, or arrived at the same point. There is a difference between the impulse to name a child after a flapper from the Twenties, or search however futilely for some moniker that will repair historical trauma. Journeys were taken — across newly developed borders, off West in search of a better life, or to a new city for the next phase of a career — and some things have been rent that now cannot quite be stitched back together. One can only ever point one’s gaze toward the future, and project into that unfinished space a hope — that some future child will come and weave in words the thing that will, finally, suture the wound shut. One is forever left with ghosts: a yearning for a mythical wholeness that has slipped irretrievably behind the veil of history.

Yes, I know those ghosts, but not the yearning. I suppose I’m fortunate to not be bothered by either their absence in the physical realm, nor their vague presence somewhere deep in the recesses of my consciousness. Fortunate to no longer care what my lack of yearning might make people think of me.

The New Face of Military Recruitment

AP Photo/Rogelio V. Solis

Recruitment rates are down, and while the Army works to increase the number of enlistments, it’s simultaneously working to eliminate past unethical recruiting practices. At Task & Purpose, Adam Linehan accompanies recruiters in New Jersey to see how the process works, and to meet the people who might one day form the next generation of American soldiers — if they can qualify.

Since the mid-aughts, when thousands of recruiters faced allegations of so-called “recruiting improprieties,” the Army has gone to great lengths to crack down on unethical recruiting practices — such as fudging paperwork, purposefully overlooking blatant disqualifiers, helping recruits cheat on the entrance test, and lying to enlistees (telling them, for example, “You’ll never go to war”). But the temptation to bend the rules persists, increasing whenever the pressure on recruiters to fill quotas becomes greater. That’s the case now.

“The problem is that the Army didn’t just increase the mission, they increased the demand for quality recruits,” a recruiter told me, speaking on the condition of anonymity. “So a lot of guys are cutting corners. Usually it’s just to keep their bosses off their backs — to avoid an ass chewing. It’s hard to flat-out lie when everyone has access to Google in their pockets, so they tell half-truths, which are still lies. Like, if a kid wants to join the reserve for college money, the recruiter will neglect to mention that the education benefits don’t kick in until a year after they sign their contract. That kind of stuff.”

However, among the East Orange recruiters, honesty isn’t just expected; it’s the foundation of their entire approach. In 2015, Lt. Col. Edward Croot, a Special Forces officer who commanded the Mid-Atlantic Recruiting Battalion until about five months ago, laid the groundwork for an ambitious strategy to reverse recruiting trends in the Northeast, which is the most challenging environment for recruiters in the country. Croot believed history was to blame: Over decades of dwindling participation in the armed forces, Northeasterners had grown vastly disconnected from the military. To mend the gap — to reacquaint people in the region with the organization fighting wars on their behalf — Croot opted for aggressive transparency. Recruiters would need to spend as much time as possible “outside the wire,” educating the masses about military service. In other words, they’d need to make the Army familiar.

Read the story

Being a Teenage Girl is Hard

For The California Sunday Magazine‘s “Teens Issue,” Elizabeth Weil wrote about her experience raising a teenage daughter. The piece is annotated by her own 15-year-old daughter, Hannah Duane.

Weil’s piece is poignant, heartfelt, and self-effacing. Duane’s annotations are, in a word, perfect. I say this as someone who was a teenage daughter, and who still is a little bit a teenage girl. (Maybe we are all still a little bit teenage girls, until we have to raise one of our own? Or maybe forever? I don’t know; I haven’t yet faced the challenge.)

When Weil writes that her husband threw out his back while climbing, “pissing off” Duane, Duane’s strident annotation clarifies:

I was not pissed off. I’ve told my parents this multiple times. Nobody in my family can understand that I can be disappointed but not mad at a particular person. I was in a shitty mood. I have my own thoughts, OK?

PREACH, HANNAH. Grown women on Twitter announced they needed that quote blown up and framed on their walls. We have our own thoughts, OK?

Duane’s annotations made me laugh out loud and gasp in recognition. She annotates her mother’s praise of her climbing to point out that she actually hadn’t succeeded at a specific move she was trying. “It is a weird experience to have your parents praise you for something you believe you failed at,” she writes, pointing out that it “feels like you aren’t being listened to, or maybe you’re not explaining yourself well.”

She thanks her mother for not letting her run into traffic as a toddler, but later responds to her mother’s concern about being able to protect her with, “Parents underestimate kids’ ability to figure out what is right for them.”

My absolute favorite, though, is when Weil writes about the way teens revert to certain toddler behaviors, taking appalling risks and “throwing tantrums at horrible times.” Duane annotates:

I would like to make a defense of teen tantrums. They may be a little much to deal with, but after it’s over, I find that having had a freakout when you least want to can be liberating. You did the thing you dearly wished you would not do, and you lived. There’s comfort in having it out there.

Read the story

Welcome to Parliament! Bachelors Can Only Wear Brown Shoes Every Other Tuesday

Houses of Parliament at Westminster Palace and clock tower Big Ben. Photo by: Frank May/picture-alliance/dpa/AP Images

Charlotte Higgins‘ new story on the structural decay in the Palace of Westminster, seat of Britain’s Parliament, is a fascinating look at the Old Guard’s attempt to physically hold on to the vestiges of power. The building is crumbling and requires billions of dollars in repairs — and ideally, it would be empty during the multi-year process. But what changes politically when Parliament is removed to a more modern, inclusive space from one steeped in this kind of history?

Travelling around this strange land is a fraught business. One is constantly committing mysterious, minor infractions. It is like being in a country where the language is comprehensible, but the codes of behaviour are opaque. From the Central Lobby, for example, four corridors radiate. There is no sign to tell you that you cannot take the one that leads to the House of Commons: but if you accidentally stray there, you will get an imperious ticking-off from one of the Palace doorkeepers (59 are employed by the Commons, and 23 by the Lords). There have been doorkeepers here since the 14th century: dressed in white tie, they control the movements of others with punctilious energy. I was reprimanded for loitering “on the blue carpet” in the Prince’s Chamber, and for speaking in the Royal Robing Room, which is sometimes allowed and sometimes not. Doorkeepers are also sources of gossip, wit and speculative histories of the palace. One I met suggested disapprovingly that “Comrade Corbyn” would soon be selling off Pugin’s wildly over-the-top royal throne in the House of Lords “if he has his way”. Another told me that lions depicted on the floor of a certain corridor “have their eyes shut so they can’t look up the ladies’ skirts”. Floors, as it happens, are important: green carpets mean you are in the part of the building owned by the Commons; red carpets mean the Lords.

Notices pinned everywhere contribute extra layers of admonition and exhortation. There’s a staircase that may be used only by MPs; a lift that cannot be used if the Lords are in division – that is, voting by walking into separate lobbies. The yeoman usher, described on parliament’s website as “the deputy to the gentleman usher of the black rod”, has a parking space reserved exclusively for his bicycle; a sign says so. In one courtyard there is even a sign advising parliamentarians what to do if they come across a grounded juvenile peregrine, which is try to throw a cardboard box over it. (A pair of the falcons nests on the roof.) The Lords, naturally, specialises in arcane forms of movement control. “Wives of peers’ eldest sons,” reads one notice, “and married daughters of peers and peeresses in their own right, before taking a place in the peers’ married daughters’ box, are requested to leave their names with the doorkeeper at the brass gates.” A different set of rules, needless to say, governs the movement of peers’ unmarried daughters.

Read the story

New York Radical Women and the Limits of Second Wave Feminism

New York Radical Women protest the Miss America Pageant on the boardwalk at Atlantic City, 1969. (Santi Visalli Inc./Archive Photos/Getty Images)

At New York magazine, Joy Press has compiled an oral history of New York Radical Women (NYRW), a collective that existed from 1967 to 1969 and played a large role in defining second wave feminism in the United States. Its founders were generally younger and more radical than the women of the National Organization for Women (NOW), who’d come together in 1966 to address specific legislative failures in Washington, DC. NYRW focused more on elements of the culture that held women back.

The theatrics of the group’s organizing have been seared into the public’s imagination. In 1968, they protested the Miss America pageant by interrupting its telecast, crowning a live sheep on Atlantic City’s boardwalk, and throwing objects symbolizing female oppression into a “freedom trash can.” The media called them “bra-burners” for this spectacle, and though nothing caught fire that day, the myth endured.

Along with their image-making, NYRW’s intellectual work, in the form of speeches, essays, pamphlets, and books, laid the foundation for women’s studies as an academic discipline. Press explains:

Read more…

The Joys and Sorrows of Watching My Own Birth

JoKMedia / Getty

Shelby Vittek | Longreads | December 2017 | 13 minutes (3,315 words)

 

It’s a hot August night in 1991 at the Greater Baltimore Medical Center, and the delivery room is filled with bright lights. A film crew is documenting a woman giving birth. After almost 12 hours of active labor, it’s time for her to really push.

A few anxious rounds of counting to 10 and many deep breaths later, the doctor says, “Ooooh there you go, lots of hair.”

“That’s it, the baby’s coming!” the red-haired nurse says with excitement.

That’s when I enter the picture, with a head full of red hair of my own.

* * *

I know this scene well. It’s my own birth. Not many people can say they’ve watched their own delivery, but I can.

In fact, I’ve watched myself be born more times than I should probably ever admit to. I’m doing it again tonight for the ninth time this week, sitting on the floor in my studio apartment with my eyes fixated on the television. The sight of my fiery red hair making its debut will never fail to amaze me.

The video of my birth in no way resembles your typical home video. It’s more like a documentary, with my parents and family, and then finally me, as its subjects. Every single reaction of theirs is recorded in the truest manner, and edited as well as early ’90s technology could allow. That’s because it was not shot by a proud father-to-be, but by a professional film crew. I was paid $300 to be born (the check went directly into my first college fund, I’ve been told), and the footage was used to make an educational video for other expecting parents to watch during Lamaze birthing classes. Hundreds, if not thousands, of other people have watched me be born, too.

Read more…

The Top 5 Longreads of the Week

Jahi Chikwendiu / The Washington Post via Getty Images

This week, we’re sharing stories from Luke O’Brien, Jen Gann, Tom Lamont, Norimitsu Onishi, and Sam Knight.

Sign up to receive this list free every Friday in your inbox. Read more…

A Lonely Death: The Extreme Isolation of Japan’s Elderly

Takashimadaira housing complex in Tokyo. (AP Photo/Katsumi Kasahara)

With a population of 127 million, Japan has the most rapidly aging society on the planet. As Norimitsu Onishi reports at The New York Times, elderly individuals often live in extreme isolation, albeit only a few feet from neighbors on all sides, “trapped in a demographic crucible of increasing age and declining births.” Their fate? A “lonely death” where their body may remain undiscovered in their small government apartment for days (or even years) because family is distant both physically and emotionally, and friends have all long since passed away.

The first time it happened, or at least the first time it drew national attention, the corpse of a 69-year-old man living near Mrs. Ito had been lying on the floor for three years, without anyone noticing his absence. His monthly rent and utilities had been withdrawn automatically from his bank account. Finally, after his savings were depleted in 2000, the authorities came to the apartment and found his skeleton near the kitchen, its flesh picked clean by maggots and beetles, just a few feet away from his next-door neighbors.

Mrs. Ito, age 91, lives alone in a small government apartment built back in the 1960s for up-and-coming salarymen.

She had been lonely every day for the past quarter of a century, she said, ever since her daughter and husband had died of cancer, three months apart. Mrs. Ito still had a stepdaughter, but they had grown apart over the decades, exchanging New Year’s cards or occasional greetings on holidays.

So Mrs. Ito asked a neighbor in the opposite building for a favor. Could she, once a day, look across the greenery separating their apartments and gaze up at Mrs. Ito’s window?

Every evening around 6 p.m., before retiring for the night, Mrs. Ito closed the paper screen in the window. Then in the morning, after her alarm woke her at 5:40 a.m., she slid the screen back open.

“If it’s closed,” Mrs. Ito told her neighbor, “it means I’ve died.”

Mrs. Ito felt reassured when the neighbor agreed, and she began sending the woman gifts of pears every summer in thanks for the daily glance her way.

Read the story

Assertiveness Training

Alex Milan Tracy / Sipa via AP Images

Susan Sheu | Longreads | December 2017 | 23 minutes (5,862 words)

In the early 1980s, my mother took a class at the local Wisconsin university’s student psychology center called “Assertiveness Training.” She was awakening belatedly to a version of the mind-expanding youth she had missed by marrying and dropping out of college at age 20 in 1967, during the Summer of Love. The class was taught by Dr. B, who told the students to use “I” statements to ask for what they wanted in plain terms during work and family interactions. (“I am unhappy that you said that to me. I feel that I am not heard when I speak to you.”) The idea was to learn to be assertive but not aggressive, to stop being a silently suffering martyr or someone who holds in all their anger and resentment until it boils over into inappropriate and ineffective rage or self-destructive behavior. It goes without saying that the class was all women. As she immersed herself in college again, my mother began to tell me that when I grew up, I could be anything I wanted — a doctor, a lawyer, a scientist. Even though the Equal Rights Amendment had not been ratified, she wanted me to believe that my future was up to me. Perhaps that was one reason she took Assertiveness Training, to be the kind of mother who raised a daughter who wouldn’t need a class like that.

My grandmother was the model of someone who regularly displayed inappropriate anger, someone my mom was trying to avoid becoming. My grandma Violet had once been docile, and my mom believed that she made the rest of us pay for that false submissiveness for the rest of her life. The short version of my grandmother’s story is that she didn’t marry the man she was in love with because he was Catholic and she was Protestant (this was Nebraska, circa 1928); she didn’t attend college despite receiving a debate scholarship because her mother feigned illness to keep her youngest child at home; and she tried to be a good wife in a marriage with a decent, practical man with whom she was not in love. She ran my grandpa’s restaurant while he was serving in World War II, and when he returned, she no longer had any day-to-day responsibilities in the business operations.

By the time I knew her, my grandmother was smoking, alternating between Camels and Newports, drinking gin and, if she was feeling moderate, Mogen David wine (“The Jews” drank it. And Sammy Davis, Jr., “that talented Negro,” was a Jew. It had a screw top. And it was sweet.). She told off anyone who stood in her way, and for decades after her death, my mother made me pretend she was still alive, because it was the memory of my grandma’s fiery temper more than the restraining order that kept my father away. My grandma also took Valium, prescribed by the psychiatrist she began seeing shortly before her death in 1978. I was 9 when she died, but I already knew that her outspokenness and self-medication were a great source of shame for my mom and grandpa.

I’ve since come to understand that my grandma had the appropriate response to her circumstances.

Read more…