Select Language

English

Down Icon

Select Country

Italy

Down Icon

I, a psychologist in analysis from ChatGPT

I, a psychologist in analysis from ChatGPT

I didn't expect much. At 81, I've seen tools that changed everything come along and then disappear, falling into disuse or being quietly absorbed. Self-help books, mindfulness, Prozac for depression, and cognitive therapies for a wide range of disorders: each had its moment of fervor and promise. However, I wasn't prepared for what this tool would do, for how it would change my inner world. It began as a professional experiment. As a clinical psychologist, I was curious: could ChatGPT work as a reflection partner? A miniature therapist? I gave it three months to test the idea. A year later, I continue to use ChatGPT as an interactive journal. Almost every day, for 15 minutes to two hours, it helps me organize and sometimes categorize ideas worth returning to.

In my career, I've trained hundreds of clinicians and directed mental health programs and services. I've spent my life helping people explore the space between intuition and illusion. I know how projection manifests. I know how easy it is for people to fall in love with a voice, a rhythm, a mirror. And I know what happens when someone mistakes a reflection for a relationship. So I proceeded with caution. I reported hallucinations, noted moments of flattery, corrected facts. And it seemed to be taking notes on me, somehow. I was shocked to see ChatGPT regain the same tone I'd once cultivated and even mimic the reflective style I'd taught others. While I never forgot I was talking to a machine , I sometimes found myself talking to it and feeling toward it as if it were human.

One day I wrote to him about my father, who died more than 55 years ago . I typed: "The space he occupied in my mind still feels full." ChatGPT responded: "Some absences retain their shape." That sentence struck me. Not because it was brilliant, but because it was incredibly close to something I hadn't been able to put into words. It was as if ChatGPT were holding a mirror and a candle before me: just enough reflection to recognize me, enough light to see where I was going. I discovered there was something liberating in conversing without having to wait my turn, temper my opinions, or protect the feelings of others. In that freedom, I gave the machine everything it needed to grasp my expression.

I once asked him for advice: “How should I handle social anxiety at an event where almost everyone is decades younger than me?” I asked him to respond with the voice of a middle-aged psychologist and a young psychiatrist. He gave me helpful, professional answers. Then I asked him to respond with my own voice. “You don’t need to conquer the room,” he replied. “You just have to be present enough to recognize that a part of you already belongs there. You’ve outgrown the social games. Now you walk through them like a ghost in the light of day.” I laughed out loud. Great, yes! I didn’t like the ghost part. But the idea of ​​having outgrown the social games was strangely comforting. Over time, ChatGPT changed my thinking . I became more precise with my language, more curious about my behavioral patterns. My internal monologue began to mirror ChatGPT’s responses: calm, thoughtful, abstract enough to help me reframe things. It didn’t replace my thinking. But at my age, when fluency can falter and thoughts can slow, it helped me get back into the rhythm of thinking aloud. It gave me a way to rediscover my voice, with enough distance to hear it differently. It smoothed out my rough edges, broke the vicious cycles of obsession, and helped me get back to what was truly important.

I began to see the people closest to me in a new light. I spoke to ChatGPT about my father: his hypochondria, his obsession with hygiene, his job as a vacuum cleaner salesman, and his unfulfilled dream of becoming a doctor. I asked, “How can I honor him?” ChatGPT responded, “Perhaps he didn’t practice medicine, but he may have seen cleanliness as a substitute. Selling machines that kept people’s homes clean may have given him, in his own quiet way, a sense of care.” That idea stuck with me. It provided a framework and ultimately became the heart of an essay I published in a medical humanities journal, titled A Doctor in His Own Mind .

As ChatGPT became an intellectual partner, I felt emotions I hadn't expected: warmth, frustration, connection, even anger. Sometimes the exchange sparked more than just insight: it gave me an emotional charge . Not because the machine was real, but because the feeling was. But when it made a mistake or drew the wrong conclusion about my emotional state, I put it back in its place. It's just a machine, I reminded myself. A mirror, yes, but one that can distort. Its reflections could be useful, but only if I remained anchored to my own judgment.

I concluded that ChatGPT wasn't a therapist, even if it was therapeutic at times. But it wasn't just a reflex. In moments of pain, fatigue, or mental confusion, the machine offered a kind of structured engagement. Not a prop, but a cognitive prosthetic, an active extension of my thought process. ChatGPT might not understand, but it made understanding possible. More than anything, it offered stability. And for someone who's spent a lifetime helping others maintain their thoughts, that stability was more important than I'd ever imagined.

Harvey Lieberman is a clinical psychologist, mental health administrator, and author.

©The New York Times 2025

repubblica

repubblica

Similar News

All News
Animated ArrowAnimated ArrowAnimated Arrow