When AI Entered the Therapy Room

Exploring what happens when artificial intelligence meets one of the most human crafts of all: therapy.

When I first started thinking about artificial intelligence and its impact on our humanity, I felt oddly reassured about my profession. I told myself how lucky I was to have chosen a line of work grounded in human connection. Poor software engineers and copy editors, I thought, and what about journalists and IT support staff? Surely therapy, one of the oldest crafts of care, must be immune to the encroachment of technology.

But almost overnight, the tables turned. Suddenly, companionship and therapeutic support became primary use cases for AI tools like ChatGPT.

Soon I began hearing stories from people around me. It wasn’t just that more folks began dabbling with therapy bots like Abby, Ash, Woebot, and Wysa. A writer friend who’d been vehemently against generative AI for content creation turned to ChatGPT to calm herself in the early hours when her therapist and friends were asleep. A researcher friend told me she had gotten into comparing a chatbot’s dream interpretations to her therapist’s. Both insisted they’d never use AI for their professional work, but for self-soothing, reflection, or late-night anxiety, it had suddenly become useful.

AI Has Already Entered the Therapy Space

Like my friends, many clients are already using AI tools for journaling, venting, or simulating supportive dialogue between sessions, even if they never mention it to their therapist. Some therapists are experimenting with AI for scheduling, notes, or generating psychoeducational materials. Others use it to articulate insights that might otherwise stay half-formed.

In our recent report on AI in the Therapy Room, more than 60% of surveyed clients said they’ve interacted with some form of mental health chatbot or journaling assistant. Many described it as helpful but hollow: a place to start unpacking feelings, though not one they would ever fully trust. Around 70% worried about data privacy, and 82% said they wouldn’t want AI to replace any part of human therapy. Still, over half were open to AI as a companion, a gentle, low-stakes space to explore thoughts before bringing them into therapy.

Therapists, meanwhile, are split between curiosity and unease. About 40% are already using AI, mostly for practical tasks like writing or translation. But when it comes to clinical use, the tone shifts. Many fear bias, inaccuracy, and a gradual dehumanization of therapy. They worry that something essential will be lost when reflection becomes synthetic and automated.

The Quiet Pragmatism of Care

Still, beneath the ambivalence runs a quiet pragmatism. Many see potential in AI if used thoughtfully. Not to replace connection, but to lighten the invisible labor of therapy, such as note-taking, summaries, and documentation. The common thread was clear: AI should never take precedence over the intimacy of the therapy space.

Reading through the data, what stands out most is a deep longing for better care, less burnout, and systems that support rather than strain. In this sense, AI becomes a mirror, reflecting what we wish therapy, and perhaps the world around it, could be: more accessible, more responsive, more connected.

AI in and Outside the Therapy Room
Confronting the truth about AI and therapy
Our free report gives an in-depth look at the most up-to-date trends in AI usage in the context of psychological therapy. To learn what this means for the future of care, visit our report page.

What Should Remain Human?

At It’s Complicated, we’ll keep exploring what an ethical, human-centered integration of technology might look like. How can we use it to support therapists and clients without eroding what makes therapy human? What tools could ease the load while preserving the soul of the work?

The answers won’t come from technology itself, but from the questions we dare to ask about what it means to care and what we want to protect, be it the effort, rupture, and repair inherent in human relationships, or the broader ecological context we all inhabit.

Perhaps that’s where the heart of this conversation lies: in staying present with the paradox. AI has made instant emotional support available to anyone with a device, yet this same progress stirs unease. Overreliance on chatbots has led to devastating consequences, while the data centers that power them consume vast amounts of drinkable water in a world already burning and falling short of its climate goals.

Maybe the work of therapy, and of being human, is to stay grounded and steadfast in our humanity, our compassion, and our hope that we can still find each other in an increasingly disjointed world.