In the Washington Post, Libby Copeland follows the story of Alice Collins Plebuch, a 69-year-old woman who believed she was the daughter of Irish Americans until she took a “just-for-fun DNA test” that upended everything she thought she knew about her family history.
Genetic testing companies like 23andMe and Ancestry.com have made it much easier for consumers to learn more about their genealogy and health risks. But home testing kits have also led people to unexpected discoveries:
For adoptees, many of whom can’t access information about their birthparents because of closed adoption laws, DNA testing can let them bypass years, even decades, of conventional research to find “DNA cousins” who may very well lead them to their families.
But DNA testing can also yield uncomfortable surprises. Some testers, looking for a little more information about a grandparent’s origins, or to confirm a family legend about Native American heritage, may not be prepared for results that disrupt their sense of identity. Often, that means finding out their dad is not actually their dad, or discovering a relative that they never knew existed — perhaps a baby conceived out of wedlock or given up for adoption.
In 2014, 23andMe estimated that 7,000 users of its service had discovered unexpected paternity or previously unknown siblings — a relatively small fraction of overall users. The company no longer provides data on surprise results. However, its customer base has more than doubled since 2014 and now includes more than 2 million people — and as more people take up recreational genomics, bloodline surprises are certain to become a more common experience. The 2020s may turn out to be the decade that killed family secrets, for better and for worse.
I don’t know about you, but the last two winters where I live have been plagued by extreme flu seasons, with scores of friends and coworkers complaining about getting sicker than they had in ages. Now that it’s summer, I don’t want to think about the hacking and nose-blowing of winter, but a lot of people have started asking an important question: are pathogens getting stronger?
He decided to start where pharmaceutical chemistry had left off decades earlier: in the messy real-world settings where bacteria duke it out. He launched his campaign, called Swab and Send, in February 2015. For £5, participants got a sample tube, a mailing envelope, and an explanation of what Roberts wanted them to look for: a spot in the environment where bacteria were likely to be competing for nutrition and room to reproduce. He asked them to use their imagination. The less sanitary, the better.
In a departure from the first antibiotic searches, Roberts does not ask his sample-collectors to focus on soil. Instead he wants them to search in places his predecessors may have overlooked. “There’s such a rich microbial environment everywhere around us,” he says. “Every single place is a niche, where bacteria will have evolved and adapted independently. Soil may have evolved biological warfare, if you like, completely differently than a marine environment, or a muddy environment, or contaminated pond water. There’s a possibility of different chemistry everywhere.”
The Swab and Send campaign fired people’s enthusiasm: Within two months, Roberts received more than £1,000, and hundreds of swabs. Small checks continue to arrive by mail. (The price of participation has gone up, to £30 for five swabs.) Elementary schools invite Roberts to make presentations, and he gives the kids swabs to take home. He has taken sample tubes to parties and to newsrooms. He has two swabs that were swiped across desks in the Houses of Parliament.
When James Vlahos learned that his father, John, was diagnosed with stage IV lung cancer, he set out to save his dad’s memories by programming them into a chatbot that he could pull up from his phone. The chatbot, nicknamed “Dadbot,” could emulate John in text conversations, bringing up various stories from John’s life. Vlahos wrote about his experience in Wired, and recounts the first time his mother interacted with “Dadbot”:
The Dadbot gives my mom a few other pointers, then writes, “Finally, it is critical that you remember one final thing. Can you guess what it is?”
“Not a clue.”
“I will tell you then. The verb ‘to be’ takes the predicate nominative.”
My mom laughs as she reads this stock grammar lecture of my father’s. “Oh, I’ve heard that a million times,” she writes.
“That’s the spirit.” The Dadbot then asks my mom what she would like to talk about.
“How about your parents’ lives in Greece?” she writes.
I hold my breath, then exhale when the Dadbot successfully transitions. “My mother was born Eleni, or Helen, Katsulakis. She was born in 1904 and orphaned at three years old.”
“Oh, the poor child. Who took care of her?”
“She did have other relatives in the area besides her parents.”
I watch the unfolding conversation with a mixture of nervousness and pride. After a few minutes, the discussion segues to my grandfather’s life in Greece. The Dadbot, knowing that it is talking to my mom and not to someone else, reminds her of a trip that she and my dad took to see my grandfather’s village. “Remember that big barbecue dinner they hosted for us at the taverna?” the Dadbot says.
A New York Magazine story on climate change is making the rounds on the internet, frequently being shared by people characterizing it as a “terrifying” “must-read.” “It is, I promise, worse than you think,” writes David Wallace-Wells, who goes on to tell his readers that even the most anxious among them are unaware of the terrors that are possible “even within the lifetime of a teenager today.”
What many readers seem to be overlooking is how frequently words like “may” appear in the text of Wallace-Wells’ article. “May” is in there seven times; “suggest” six times; “possible” and its variants a few more. Wallace-Wells is, of course, referencing the positions of scientists, who he says have become extra cautious due to “climate denialism,” steering the public away from “speculative warnings” that could be debunked by future scientific progress, weakening their own case and giving weight to their opponents.
As Jack El-Hai wrote for Longreads in April of this year, science editor Peter Gwynne is still dogged by an article he wrote for Newsweek more than 40 years ago, “The Cooling World,” which predicted — wrongly, as it turns out — another Ice Age. The prediction, Gwynne claimed at the time, was supported by evidence mounting so quickly that “meteorologists are hard-pressed to keep up with it.” The evidence Gwynne relied on has since been disproved — a phenomenon not uncommon in a field as relatively young as climate study. As El-Hai noted:
The study of the world’s climate was still primitive in the 1970s. Few meteorological scientists then knew how to interpret trending temperature information, and the cause of climate changes was mysterious. The information that climate researchers had collected was incomplete and easy to misread. The biosciences have advanced by huge leaps since then, and many more scientists now study the climate.
Gwynne’s article was used for decades as fodder by those who trade in what Wallace-Wells dubs “climate denialisms,” showing how those determined not to believe in a certain scientific finding can benefit from the natural trial-and-error of most scientific inquiry.
After Wallace-Wells’ piece was published, climate scientist Michael E. Mann took to Facebook to criticize the story. (He also claimed Wallace-Wells interviewed him but didn’t quote or mention him.) Mann is equally critical of “doomist framing” and of “those who understate the risks” of climate change, and argues that Wallace-Wells’ article makes “extraordinary claims” without the “extraordinary evidence” needed to back them up.
About the risk of catastrophic methane released by melting permafrost, for example, Mann says the science “is much more nuanced and doesn’t support the notion of a game-changing, planet-melting methane bomb. It is unclear that much of this frozen methane can be readily mobilized by projected warming.”
Mann also highlights Wallace-Wells’ reference to “satellite data showing the globe warming, since 1998, more than twice as fast as scientists had thought.”
“That’s just not true,” writes Mann. “The study in question simply showed that one particular satellite temperature dataset that had tended to show less warming than the other datasets, has now been brought in line with the other temperature data after some problems with that dataset were dealt with… The warming of the globe is pretty much progressing as models predicted… which is bad enough.”
Mann’s position is that the evidence supporting the notion that climate change is “a serious problem that we must contend with now” is overwhelming enough without a doomsday narrative that he fears has a “paralyzing” effect and makes people feel hopeless, potentially deterring efforts to mitigate the human-caused harm.
There’s an argument to be made in defense of Wallace-Wells’ meltdown-style writing, however. As Atlas Obscura staff writer Sarah Laskow noted on Twitter, the territory he explores is hardly novel — New Yorker writer Elizabeth Kolbert won a Pulitzer in 2015 for a book on the topic, The Sixth Extinction: An Unnatural History. Yet Wallace-Wells’ story got readers’ attention in a way that suggested it was news they’d never encountered before.
For scientists like Mann, the evidence already at our fingertips is compelling enough to warrant immediate mitigating efforts. But not everyone is a scientist like Mann. As El-Hai noted in his piece on Gwynne’s disproved Newsweek article, a U.S. Senator held up a snowball on the Senate floor in 2015 as part of an argument that global warming isn’t real. Today, Antarctica is poised to shed one of the largest icebergs ever recorded, while the Trump administration is abandoning international climate agreements, undoing dozens of environmental regulations dealing with everything from methane to grizzly bears to chemical spills, and using the agency meant to protect the environment to launch a program challenging climate science. In light of all that, it’s easy to sympathize with Mann’s concern about making people feel so hopeless they believe there’s nothing left to be done, and just as hard to blame Wallace-Wells for despairing.
Some physicians in South Korea are working to understand the differences in healthcare across the DMZ and the health issues North Korean defectors face, in preparation for eventual reunification — not easy when the medical tools North Korean physicians have are so drastically outdated and when support for reunification is dropping in the South. At Undark, Sara Talpos talks to the doctors trying to bridge these gaps.
The practice of medicine is sharply different in the two countries. In North Korea, the focus is on infectious disease and physical trauma, often caused by coal-mining injuries. Doctors learn only the basics of other diseases because specialized medicines and equipment — chemotherapy for cancer, for example — simply aren’t available.
Ko laughs when I tell him I’ve heard North Korean X-ray images are so poor that a South Korean doctor wouldn’t be able to understand them. “Yes, that’s true,” he says, sipping a cup of coffee. We’re meeting at Steff Hotdog, a fast-food restaurant located, somewhat improbably, inside Anam Hospital. “That’s because they don’t have X-ray film.” Instead, the doctor takes the patient into a dark room, where the patient stands between the X-ray machine and a translucent screen. Ko borrows my pen to illustrate. His doctor sits hunched over on a stool like Rodin’s “The Thinker.”
In the 2000 film Unbreakable, we’re introduced to two characters at opposite ends of a spectrum: an extremely frail man with a brittle bone disease played by Samuel L. Jackson, and a man with superhuman levels of strength and invulnerability played by Bruce Willis.
“However unreal it may seem, we are connected, you and I,” Jackson’s character tells Willis’. “We’re on the same curve, just on opposite ends.”
In a recent issue of Wired, reporter Erika Hayasaki introduced us to another pair of people at opposite ends of a spectrum.
Steven Pete has a rare neurological condition that makes him unable to feel pain.
Pete pauses for a moment and recalls a white Washington day a few years ago. “We had thick snow, and we went inner-tubing down a hill. Well, I did a scorpion, where you take a running start and jump on the tube. You’re supposed to land on your stomach, but I hit it at the wrong angle. I face-planted on the hill, and my back legs just went straight up over my head.” Pete got up and returned to tubing, and for the next eight months he went on as usual, until he started noticing the movement in his left arm and shoulder felt off. His back felt funny too. He ended up getting an MRI. “The doctor looked at my MRI results, and he was like, ‘Have you been in a car accident? About six months ago? Were you skydiving?’ ”
“I haven’t done either,” Pete replied.
The doctor stared at his patient in disbelief. “You’ve got three fractured vertebrae.” Pete had broken his back.
Pam Costa has the opposite neurological condition — she feels pain constantly, as if her body is on fire.
Because the inflammation is exacerbated by physical contact, stress, and even the smallest elevation in surrounding temperature, Costa lives her life with great care. She wears loose-fitting clothes because fabric feels like a blowtorch against her skin. She sleeps with chilled pillows because the slightest heat makes her limbs feel like they are crackling. “Have you ever been out in the bitter, bitter cold, where your feet were ice?” she asks me. “Almost frostbite? Then you warm them up and it burns? That burning sensation: That is what it feels like all the time.”
Pete and Costa are also connected, sharing a genetic link that has helped scientists understand why we experience pain and how to treat it.
At Seattle Met, James Ross Gardner reports on the surprising social arrangements and habits of crows, who recognize and remember individual people and hold funerals to honor their dead — a phenomenon that is helping scientists like Kaeli Swift understand how intelligent creatures process death. Feed a crow and she will gift you with keys and candy as tokens of her appreciation. Treat her poorly and she and her corvid compatriots may mob you on sight.
But what if I were to tell you that the crows you spy in your yard are almost always the same individual crows? That those birds—usually two, a male and a female known as a territorial pair—don’t live there but fly in every day from 20 miles away? During the day urban crows rummage and build nests in a specific spot, in a specific neighborhood, then decamp for the evening to a massive, crowded roost outside the city—their own crow planet—and report back to the neighborhoods each morning. Like you, they commute to work.
At my house, we’d noticed the lack of bees this spring, but we’d chalked it up to the late arrival of warm weather and a rainier-than-usual season. Turns out our anecdotal data gathering isn’t entirely off — there are fewer bugs. Of all kinds.
Entomologists call it the windshield phenomenon. “If you talk to people, they have a gut feeling. They remember how insects used to smash on your windscreen,” says Wolfgang Wägele, director of the Leibniz Institute for Animal Biodiversity in Bonn, Germany. Today, drivers spend less time scraping and scrubbing. “I’m a very data-driven person,” says Scott Black, executive director of the Xerces Society for Invertebrate Conservation in Portland, Oregon. “But it is a visceral reaction when you realize you don’t see that mess anymore.”
Some people argue that cars today are more aerodynamic and therefore less deadly to insects. But Black says his pride and joy as a teenager in Nebraska was his 1969 Ford Mustang Mach 1—with some pretty sleek lines. “I used to have to wash my car all the time. It was always covered with insects.” Lately, Martin Sorg, an entomologist in Krefeld, Germany, has seen the opposite: “I drive a Land Rover, with the aerodynamics of a refrigerator, and these days it stays clean.”
Though observations about splattered bugs aren’t scientific, few reliable data exist on the fate of important insect species. Scientists have tracked alarming declines in domesticated honey bees, monarch butterflies, and lightning bugs. But few have paid attention to the moths, hover flies, beetles, and countless other insects that buzz and flitter through the warm months. “We have a pretty good track record of ignoring most noncharismatic species,” which most insects are, says Joe Nocera, an ecologist at the University of New Brunswick in Canada.
What exactly does a bullet do to flesh as it careens through the body? At Highline, Jason Fagone profiles Philadelphia trauma surgeon Dr. Amy Goldberg, a woman on the front lines of gun violence as she attempts to repair the broken bodies that arrive daily at Temple University Hospital. Dr. Goldberg doesn’t only fix the damage, she’s also working to prevent it. After a patient died the third time he was shot, she worked with friend and coworker Scott Charles to create a social program, Turning Point, which has been instrumental in stopping gun violence before it starts.
More than 30,000 people die of gunshot wounds each year in America, around 75,000 more are injured, and we have no visceral sense of what physically happens inside a person when he’s shot. [Dr. Amy] Goldberg does.
“The creation of a person, you know. It’s the heart beating and the lungs bringing air. It is so miraculous.” Surgery, for Goldberg, was a way of honoring the miracle. And trauma surgery was the ultimate form of appreciation, because a surgeon in trauma experienced so much variety. She might be operating on the carotid artery in the neck, or the heart in the chest, or the large bowel or small bowel in the abdomen, or the femoral artery in the thigh, at any given moment, on any given night.
“As a country,” Goldberg said, “we lost our teachable moment.” She started talking about the 2012 murder of 20 schoolchildren and six adults at Sandy Hook Elementary School. Goldberg said that if people had been shown the autopsy photos of the kids, the gun debate would have been transformed. “The fact that not a single one of those kids was able to be transported to a hospital, tells me that they were not just dead, but really really really really dead. Ten-year-old kids, riddled with bullets, dead as doornails.”
Christian H. Cooper made his way from Appalachia to Wall Street, and from poverty to wealth. But is it because he worked harder than the family and friends still struggling in East Tennessee, or was it luck? In Nautilus, he digs into the emerging science of epigenetics to look at the way poverty actually changes our genetic expression, and therefore our physiology. If poverty has treatable physical aspects, what does that mean for economic policy, social policy, and politics? What does it mean for the American ideal of meritocracy?
Now, new evidence is emerging suggesting the changes can go even deeper—to how our bodies assemble themselves, shifting the types of cells that they are made from, and maybe even how our genetic code is expressed, playing with it like a Rubik’s cube thrown into a running washing machine. If this science holds up, it means that poverty is more than just a socioeconomic condition. It is a collection of related symptoms that are preventable, treatable—and even inheritable. In other words, the effects of poverty begin to look very much like the symptoms of a disease.
That word—disease—carries a stigma with it. By using it here, I don’t mean that the poor are (that I am) inferior or compromised. I mean that the poor are afflicted, and told by the rest of the world that their condition is a necessary, temporary, and even positive part of modern capitalism. We tell the poor that they have the chance to escape if they just work hard enough; that we are all equally invested in a system that doles out rewards and punishments in equal measure. We point at the rare rags-to-riches stories like my own, which seem to play into the standard meritocracy template.