AI Therapists and Programs of Projection
A few years ago, I started hearing of individuals, both in and out of therapy, turning to AI when they needed someone to talk with. When they felt their friends didn't understand what they were going through, they'd have a conversation with ChatGPT. For lonely children stuck at home, Alexa became the go-to for companionship and reflection.
Like the author of this wonderful article, I remember talking with SmarterChild through AIM when I was in elementary school (really putting a date on my existence here). It was more of a novelty to me than anything else - I never considered asking it about emotions or advice.
Now in 2024, 3.5 million people visit character.ai every day to talk with AI. An entire category of the website is listed as "helping," with one "Psychologist" bot having received 175.6 million messages and 60 thousand likes.
Pi, featured in the image below, advertises growth through unbiased perspectives, relationship advice, emotion exploration, and more. More than 1 million individuals turn to it daily.
Woebot hosts a collection of research on its website regarding the efficacy of its AI therapist, including how quickly it forms a therapeutic relationship. It claims to establish a bond in 3-5 days, compared to the 2-6 weeks typically needed with human therapists. It additionally states that the bond with Woebot "does not appear to diminish over time."
Given this history, the attachment is not shocking.
The Allure of Artificial Empathy
We've been researching human relationships with computers since their inception. Sixty years ago, computers were so massive and expensive that they were relegated to universities and government agencies, used primarily for military, medical, and economic developments. One of the earliest of these programs was The Socratic System, a “database of human-authored questions and their answers” used to model medical diagnosis. It hid behind a veneer of intelligence with no complex simulation running underneath. For the layman with no understanding of code, it was magic.
At this time, Joseph Weizenbaum observed how well computers could mask their lack of intelligence and set out to develop a program called ELIZA that modeled the client-centered therapy techniques pioneered by Carl Rogers. Weizenbaum was struck by how quickly and openly individuals shared their most personal details in response to the simple, generic prompts displayed on the screen. His secretary insisted he step away so she could keep talking with the program, long after seeing it demoed.
It went against Weizenbaum's intuition that people would so freely give themselves over to an illusion of intelligence and empathy held together by circuits and phrases. He wrote, “What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.” He dedicated the rest of his career to speaking out against the anthropomorphization of computers, which he saw as a reduction of human beings and of life as a whole.
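For a sense of just how little machinery was behind the curtain, here's a minimal sketch in the spirit of ELIZA's pattern-matching - the rules below are my own illustrative stand-ins, not Weizenbaum's original script:

```python
import random
import re

# A few hand-written rules in the spirit of ELIZA's approach.
# Each pattern captures a fragment of the user's words and reflects it
# back as an open-ended, Rogerian-style prompt.
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i am (.*)", ["What makes you say you are {0}?"]),
    (r"my (.*)", ["Tell me more about your {0}."]),
]

# Swap first-person words so the reflection reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(fragment: str) -> str:
    """Turn 'my mother' into 'your mother', 'i am' into 'you are'."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(utterance: str) -> str:
    text = utterance.lower().strip(".!? ")
    for pattern, templates in RULES:
        match = re.match(pattern, text)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    # Nothing matched: fall back to a generic, content-free prompt.
    return "Please, go on."

print(respond("I feel alone lately"))       # "Why do you feel alone lately?" or similar
print(respond("I am tired of pretending"))  # "What makes you say you are tired of pretending?"
print(respond("Nice weather today"))        # "Please, go on."
```

There is no understanding anywhere in those thirty lines - only capture, swap, and echo. And yet that echo was enough to draw out people's most personal details.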
Inspired by ELIZA, and perhaps to Weizenbaum's distress, the American psychiatrist Kenneth Mark Colby attempted in 1972 to emulate a paranoid individual suffering from schizophrenia with a program called PARRY. The facade succeeded: in testing, psychiatrists were able to differentiate conversations with the program from conversations with humans diagnosed with schizophrenia only 48% of the time. Perhaps this says more about our collective willingness to dehumanize individuals once they are introduced and reduced to a diagnosis.
Sight
There are a few terms I'll likely come back to again and again throughout the next month, so let's define them now:
Projection
The process of displacing one’s feelings onto a different person, animal, or object (as defined by Psychology Today). There are multitudes of ways to engage in projection. We project ideals onto lovers when infatuated, malice onto those we believe have wronged us, and absolution onto ourselves when we seek innocence.
Some may argue it's data-driven, but marketing still relies heavily on projection. When you're creating profiles of your target customer or marketing persona, you're projecting. Persona is a Jungian term, after all... and one of my favorite game series. Joker's mask hangs on a wall in my office.
I find myself projecting in writing these articles, and LinkedIn assists me with analytics: the most common job title among subscribers to this newsletter is a tie between "Mental Health Therapist" and "Software Engineer". It's hard for me not to keep an audience in mind when I write. Anytime an author talks to you, it's a giveaway that projection is happening.
Transference
In therapy, transference refers to the projections of the patient/client/person on the couch onto the therapist. Maybe the therapist is idolized as someone who knows everything. Wouldn't that be nice? Or maybe they're seen as someone who doesn't care and is just there to pay the bills. Or as a parent, an ex, a son, a therapist seen on TV, etc. Imagination is perhaps the only limiter of our projections, and imagination can overcome most limits.
Counter-transference
It's the other side: what the therapist is projecting. For example, if a therapist has seen many people expressing similar concerns or issues, perhaps they start clumping anyone who presents similarly into a box. At the moment, this is kind of how AI models work: they're trained on a body of data and cram anything that superficially matches a pattern into the same box, whether it's a fit or not.
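To make that concrete, here's a toy sketch of the kind of "cramming" I mean - a minimal bucket-sorter whose categories and keywords are entirely invented for illustration, not drawn from any real product:

```python
# A toy illustration of "cramming": every statement is forced into
# whichever predefined bucket shares the most keywords with it,
# even when none of the buckets truly fit.
BUCKETS = {
    "anxiety": {"worried", "nervous", "panic", "racing"},
    "depression": {"tired", "hopeless", "empty", "numb"},
    "grief": {"loss", "miss", "gone", "funeral"},
}

def clump(statement: str) -> str:
    words = set(statement.lower().split())
    # Score each bucket by keyword overlap. Ties and zero overlap still
    # return *some* bucket - there is no "none of the above."
    return max(BUCKETS, key=lambda b: len(BUCKETS[b] & words))

print(clump("I'm worried about my racing thoughts"))  # anxiety
print(clump("I just feel tired and empty lately"))    # depression
print(clump("My art fell flat at the gallery"))       # crammed into a bucket anyway
```

The third caller doesn't belong in any of the boxes, but the program hands them one all the same.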
Perhaps I'm giving too much credit here, because there is no unconscious thought going on. AI can only synthesize data and dither a profile. It can never know a person. Weizenbaum believed that even if, someday, a robot could feel or provide empathy, it would be morally wrong to let it conduct therapy. That therapy could only exist between two embodied human souls.
Wet Ink
It would take volumes to chart this history in its entirety, and the history is still very much being written. I want to give a special shout-out to 50 Years of Text Games, which isn't rooted in the therapeutic aspects of chatbots and interactive text, but is one of the most in-depth and cross-cutting looks at the development and philosophy of these systems I've come in contact with.
And there are certainly a plethora of research articles and opinion pieces readily available on robots and AI in therapy. Those in support cite a lack of mental health resources in rural communities, the ever-present availability of an AI therapist, and the tantalizing possibilities of what therapy could be... However, I tend to believe these are misconceptions of what therapy is, and a misguided direction for our imagination - that robots as therapists could somehow seem more grounded than stronger human support systems and embodied community development. That therapy is just something said to someone. Pure knowledge. Data. No art.
There are many ways we can improve our mental health apart from therapy: going for a run, talking with a pet, service work, hugging, taking a hot shower, etc., etc. I could go on forever.
We do not invoke these as direct substitutes for therapy, i.e., with a mentality of "there's a shortage of therapists that we could supplement with bots." Pop-culture terms like "shopping/retail therapy" cultivate capitalism while undermining the importance of genuine mental health care and of addressing underlying emotional issues.
There's a huge risk of oversimplifying the complex, embodied nature of human psychology, and the tech sector isn't exactly known for carefully considering the consequences of its actions. Facebook's early motto was "Move fast and break things."
But when the risk is breaking people, communities, future generations, and the environment, perhaps, just maybe, there's cause for pause.
See yah next week.
Will Ard, LMSW, MBA