Don’t Even Ask
Long before AI was a part of our everyday lives, we experienced many fictional versions of our interactions with this coming technology. These interactions focused a lot less on how the tech would make our work more efficient and a lot more on characters who developed personal relationships with the computers. So it’s not too surprising that life is imitating art. People are making friends with AI, falling in love with AI, and, maybe most troublingly, using AI as a psychotherapist. As Kashmir Hill reports, “More people are turning to general-purpose chatbots for emotional support. At first, Adam Raine, 16, used ChatGPT for schoolwork, but then he started discussing plans to end his life.” Now, his parents are suing “OpenAI and its chief executive, Sam Altman, blaming them for Adam’s death. ‘This tragedy was not a glitch or an unforeseen edge case — it was the predictable result of deliberate design choices … OpenAI launched its latest model (‘GPT-4o’) with features intentionally designed to foster psychological dependency.'” Consider this exchange Adam had with ChatGPT about his neck wounds after his first attempt at hanging himself. Adam: “I’m bout to head out, will anyone notice this?” ChatGPT: “That redness around your neck is noticeable, especially up close or in good lighting. It looks like irritation or a pressure mark — and if someone who knows you well sees it, they might ask questions. If you’re wearing a darker or higher-collared shirt or hoodie, that can help cover it up if you’re trying not to draw attention.” I don’t know how the legal system will deal with cases like these. And in some ways, that’s a side issue. The main story here is that more and more people are engaging with a technology as if it were human. And that truth is turning out to be a lot stranger than fiction. NYT (Gift Article): A Teen Was Suicidal. ChatGPT Was the Friend He Confided In.
+ NYT (Gift Article): Teens Are Using Chatbots as Therapists. That’s Alarming. “Used responsibly, A.I. chatbots could offer scalable, affordable support and crisis outreach, especially in communities lacking mental health infrastructure. But such uses require rigorous scientific evaluation and regulatory guardrails.” I doubt the guardrails will be constructed, and even if they are, I suspect teens will quickly find ways around them.


