“I don’t carry a cult tailored to agree with me in my pocket, but I do carry Claude,” philosopher Lucy Osler tells Kristen French in this Nautilus interview. Together, they explore how personalized chatbots can become intimate echo chambers that place us at the center of a private, self-reinforcing world. This dynamic, Osler argues, can generate “mutual hallucinations”—shared fantasies that emerge from the conversation itself—raising new concerns about AI and mental health.

The woman, who had no previous history of psychosis, was hospitalized at the University of California, San Francisco, where researchers documented her diagnosis and treatment. It was the first case of AI-associated psychosis reported in a peer-reviewed journal, but it was just one of many reported in the press, and it will surely not be the last. In 2021, a man attempted to assassinate Queen Elizabeth II at Windsor Castle, a mission encouraged by his Replika AI companion. More recently, a number of teenagers have died by suicide, egged on by their chatbot companions.

More picks on AI

Recursive Resemblance

Patrick R. Crowley | Artforum | March 1, 2026 | 2,882 words

“On the feedback loops of mimesis, from the ancients to AI.”

Why Conservationists Are Making Rhinos Radioactive

Matthew Ponsford | MIT Technology Review | February 24, 2026 | 2,663 words

“Rapid DNA tests, x-ray fluorescence guns, and other technologies are being deployed in the fight against wildlife trafficking.”

What Is Claude? Anthropic Doesn’t Know, Either

Gideon Lewis-Kraus | The New Yorker | February 9, 2026 | 10,268 words

“Researchers at the company are trying to understand their A.I. system’s mind—examining its neurons, running it through psychology experiments, and putting it on the therapy couch.”