Matthew S. Burns on AI, Empathy, and the Making of Eliza
In 2023, the Museum of Design Atlanta held an exhibit titled "Level Up: Pixels, Play, and Progress," which showcased games dedicated to activism, human connection, and empathy. I made an effort to play every game exhibited in its entirety, and the one that felt most adjacent to my work was Eliza.
Released in 2019 by Zachtronics, Eliza puts players in the role of Evelyn, a former tech worker who returns to the workforce as a human proxy for an AI counseling program. Through this proxy system, Evelyn reads scripted responses generated by the AI while listening to clients pour out their hearts - a setup that forces players to grapple with fundamental questions about human connection, the role of technology in mental healthcare, and the true nature of empathy. Sound familiar?
I'm exceptionally grateful for the opportunity to talk with Matthew S. Burns, the writer, composer, and director of Eliza. Matthew has worked on numerous video games: as a producer on Halo 3 and Halo 3: ODST, as a writer of interactive fiction, video games, and traditional fiction, and as a composer/remixer for games like Fortune's Foundation and Celeste. It all adds up to an eclectic suite of projects, and they're always neat.
Looking Back
Will: It’s been like five years since Eliza came out, and there's been a big change in AI and awareness around AI.
Matthew: Yeah, definitely. Over the last couple years, AI has shot to the top of the hype cycle (laughing) so there’s a lot more interest in using AI for everything. Including therapy.
Will: Agreed. If you were making Eliza today, would you do anything different?
Matthew: Yeah, that’s an interesting question. I might have made Eliza look a little bit smarter initially. You kind of start to see the flaws pretty quickly, because it’s implied that the system is not necessarily smart on the back-end. With the rise of ChatGPT and other natural language systems, that kind of system is fooling a lot more people into thinking they’re having a real conversation, so maybe I would dive a little bit deeper into what that really means, and does it matter? I think most of the rest of it I would keep the same.
Will: I think you nailed it.
Matthew: (laughing) Thanks! Thank you.
Will: And that was Weizenbaum's original concern: the way the real-world ELIZA fooled people.
Matthew: That’s the thing I think is perennial about it. Even though I made the game in 2019, before this AI wave had come, Weizenbaum was thinking about this stuff in the ’60s. He'd already formulated some of the problems and dangers that come when computers kind of trick people into thinking that they’re interacting with human beings. So he was at the forefront of thinking about that stuff, and those concerns haven’t gone away since then, right? It’s only become more and more relevant. It was very, very interesting to go back in time, read his writings from the era, and think about what he discovered in the context of what we have today.
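For readers who haven't met the original: Weizenbaum's 1966 ELIZA produced its illusion of understanding with nothing more than keyword matching and simple pronoun "reflection." A toy Python sketch of that kind of trick, purely illustrative and not Weizenbaum's actual script or patterns, might look like this:

import re

# Hypothetical, minimal ELIZA-style responder: keyword rules plus pronoun "reflection".
# An illustration of the general technique, not Weizenbaum's original program.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "you": "I", "your": "my", "me": "you"}

RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)",   "How long have you been {0}?"),
    (r"my (.*)",     "Tell me more about your {0}."),
    (r"(.*)",        "Please, go on."),  # fallback when nothing else matches
]

def reflect(fragment):
    # Swap first- and second-person words so the reply points back at the speaker.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement):
    text = statement.lower().strip(" .!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(group) for group in match.groups()))

print(respond("I am worried about my job."))
# -> How long have you been worried about your job?

That is roughly the whole mechanism, and it was enough to make some of ELIZA's early users feel heard, which is exactly the danger Weizenbaum went on to write about.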
Characters
Will: You did an excellent job of writing people and specifically people in therapy. How did you approach that?
Matthew: It was a combination of a couple of things. One is that I have been in therapy myself. So, I’m kind of aware of the language that’s used. I also did research on the kinds of training therapists get, and I read CBT manuals and things like that. So there are, you know, more manualized therapies and even highly scripted versions of talk therapy. I read about that.
Then in terms of talking from a voice of a person, I used writing exercises like freewriting in someone’s voice for at least a thousand words or a couple pages. Just trying to sink into their character and think about what they’re thinking about. Then adapting that material into the dialogue.
Will: To get a better idea of the characters, did you write from any diagnosis?
Matthew: No. I did not diagnose anyone. I didn’t really feel it was my place to do that, and I didn’t want to be too clever like, “Ha ha! This person’s problem is this! And I’m gonna write to that.” Instead, I wanted things to be a little bit more mysterious and interpretative. Part of why the game makes you spend so much time just talking to people in therapy is this idea I wanted to get across that… They’re just real people. They’re just irreducible from who they are. And even though you’re in this technological system that’s trying to categorize them, they can’t be. And hopefully you realize that when you have a lengthy conversation with them or multiple conversations it’s like, "Well they kind of fit this, but they kind of don’t." Because they’re human beings. Right? So, yeah I didn’t work to any specific diagnoses or anything like that.
Will: I love that. A therapist working only from a model of diagnosis could end up not treating people like people. The game as a whole is a really good examination of that. The people seemed very peopley. Like Holiday, with a focus on money. Or Mark being mandated to see Eliza. (laughs)
Matthew: (laughs) Yeah, like a non-cooperative sort of person.
Will: Can you say more about Rae?
Matthew: I’ve spent my career in the games industry, which is sort of tech-adjacent. Part of that time was at Microsoft. I was working on Xbox, but obviously you see the rest of Microsoft, and I live in Seattle. There’s just a lot of these people around, people who see tech jobs as their way to get a better life, and because they’re part of what they see as changing the world for the better, they get behind it. And so, the character of Rae is someone who is not necessarily a bad person but has bought into the propaganda of the place where she works, where there are a lot of that kind of person. Because part of that is just that it’s… easier? It’s convenient. If you are literally spending your life contributing to the mission of a big giant tech company, you don’t necessarily feel like questioning what you’re doing.
Because if you do question what you’re doing, then maybe the answer is that you’ve actually wasted your life. So there’s a certain amount of convincing yourself that what you’re doing is worth it. That what you’re doing is right. I think Rae, to me, reflects those people that I know who are not bad people, but get caught up in the mission of their company and believe the hype. Maybe because they really do, or maybe it’s just kind of a convenient thing. I’m not really sure.
Will: Yeah, maybe they need to.
Matthew: Yeah! Because they need to, you know? If you start to believe that it’s not worth it, then a lot of stuff in your life comes crumbling down that you’ve built up, right? So people need to uphold this, “Yeah, we’re changing the world! We’re helping people!” That’s where Rae comes from. For me.
Matthew: I occasionally hear from therapists and it’s really great. I couldn’t ask for anything better than that. I also occasionally hear from people working on AI therapy tools. (laughing)
Will: Oh, are they not happy? (laughing)
Matthew: Well, no. Some of the people that I hear from are the people who've been working at a start-up for a year or two and they’re starting to wonder if they're actually doing the right thing. They’re thinking “Should I actually keep working on this or should I do something else” and are in almost the exact position that Evelyn is in the game. Of like, “Wow, maybe I shouldn’t be working on this” (Will laughs) “Maybe I could do something else?”
Will: “Maybe I’m the bad guy?”
Matthew: Right, right exactly. And so a couple people have reached out to me and chatted about that as well. Which is kind of interesting.
The Digital Empty Chair
Will: Do you have any thoughts on AI in conversation?
Matthew: A lot of it is the stuff we were saying earlier, where there’s a history of chatbots and things that try to convince people that they’re actually conversing with someone. We have a history to see how people react to that kind of thing. Now it’s being rolled out to way more people, and the technology is better than it used to be. But a lot of the concerns are the same, and I think a lot of the effects are fairly predictable. One of the arguments that is made in Eliza is that if it helps you, does it matter that it’s a real person or not? Some of the people who are pro-technology say this, and that’s an argument that has been made about previous generations of chatbots as well. Before I worked on the game, I downloaded some apps that were like, “Talk to this bot and it will-”
Will: Woebot?
Matthew: Yeah, I talked to Woebot and stuff like that. Woebot just jammed me into this CBT-like thing: “What’s a thing that happened to you today that made you mad?” And then it was like, “Now rewrite it as an explanation.”
The argument to be made is that some people have pets, and they come home from a difficult day at work and they talk to their dog. “I had a bad day.” Your dog obviously is not understanding you. But the dog looks at you, and you feel like you’re kind of getting something off your chest. That can help you, right? So then apply that to fictional characters. Maybe you are pretending that you’re talking to someone and getting something off your chest. Maybe you’re talking to a thing on your computer that nods, and so there’s this like slippery slope of "Where does the reality of talking to someone else start and end?"
Because there are all these ways you can pretend to talk to someone else. Or convince yourself that you’re talking to someone else. I view the new chatbots and LLMs (large language models) on that spectrum. I don’t doubt that maybe some people talk to ChatGPT and actually feel better about themselves, even when ChatGPT just offers the most generic platitudes in response to what they have to say. Like maybe that’s what someone needs to hear.
At the same time, there’s a danger of reducing actual human communication. And I think this was the thing that Joseph Weizenbaum was concerned about, very correctly. One of the points he makes in his books is that in the future, maybe there will be a computer completely capable of providing therapy, a computer that’s smart enough to do that. However, even if that’s the case, it is still important for humans to provide each other therapy.
Because it’s sort of the principle of the thing. It’s about humans helping each other. It shouldn’t be a computer. For moral reasons - that’s his argument. And I think that’s a very interesting argument, that human connection is actually the meaningful thing that we should be looking at. Not necessarily being a provider of a certain therapy, but real communication between human beings.
Alliance and "Gaming" the System of Therapy
Will: The research I usually circle back to is that the therapeutic alliance is the biggest predictor of success in therapy.
Matthew: Right.
Will: I’ve been researching Woebot and what they’re promoting. They’re like, “Well, people are relating quicker to robots,” and all this stuff, and it’s like, well… what are you doing?
Matthew: (Laughs)
Will: First off, of course they are, but secondly, it’s not the direct substitute, it’s not the work.
Matthew: Right. Right. Yeah, and to your point about the therapeutic alliance, finding someone who feels allied with you I think is so much more important than if a computer said all the right things. I can see them claiming that people relate to computers easier, at first. Because they are sort of more polite right off the bat and maybe they have a better bedside manner.
People can be off-putting when you first meet them, that kind of thing. But I think there’s a limit. At the end of the day, you’re not going to be able to have that really, really deep conversation. I don’t think. Not with the tools the way they are now, you know?
Will: No, it’s like playing a game. Have you ever watched like any of Tim Rogers’s stuff?
Matthew: Oh yeah! Absolutely. Yeah, big fan.
Will: Yeah, in his Tokimeki Memorial video, where he reaches the conclusion that in order to "win" and date the girl on the box art, you have to treat her not as a human.
Matthew: Yeah (laughing) Yeah! That’s a really good point. So one thing that I’ve learned working in the game industry is that anytime you give people some kind of system that they can game, they will game it within an inch of its life. So if you give people a chat-bot, and say, “You are court mandated to talk to this chat-bot until such-and-such thing happens,” they will just figure out how to get it to do that output. They won’t take it seriously. Like when you put a game in front of someone, they try to figure out how to win it. Not how to necessarily really engage with it. If you can optimize it, that is what people will do.
Will: Yeah, the efficacy of therapy when a client feels coerced is often way lower.
Matthew: I can only imagine. And the person in therapy is maybe trying to find the magic words to say to be let go from that. Right?
Will: Yep. Yeah, then they wouldn’t have to be there.
Matthew: Yeah.
On Gendered Professions
Will: I don’t know if this was intentional or not, but there are no male therapists in Eliza.
Matthew: Yeah, that is intentional. (laughs) The tech executives in the story end up being the men, and older men at that. The work has been gendered. And that is a theme that, I remember reading, applies to therapy itself and to therapy providers. Back when it was Sigmund Freud, it was like, “Oh, we’re men! We’re inventing therapy!” Right? (Will is laughing) Then it becomes a little more commodified. Then it’s like, “Oh, women are therapists…”
And that actually parallels something that happened with software development and other tech things, where certain tech skills are looked at in different ways. For some reason, back-end programming is seen as a little bit more masculine-coded, and front-end and UI programming is seen as more feminine-coded. Those things were on my mind. For sure.
Will: That’s really interesting. I saw something that was like ~20% of the people working in tech are women, and ~20% of therapists are men.
Matthew: Mhm. Interesting.
Will: Anything you want to plug?
Matthew: Not right now. I’ve been working on other commercial games. Eliza was something very near and dear to my heart, and it was great that I was able to make it. But it’s not the "Do it to make money" kind of game. It’s very much a "Do it ’cause you want to make it" sort of thing. I want to do more stuff like that, but it’s hard to get funding to make that kind of thing.
Will: Yeah… I-I appreciate that you made it, and I think more therapists need to play it. And I’m excited to see what else you do. You were in touch with something with Eliza, and you’re probably in touch with things still.
Matthew: Thank you so much for reaching out.
Eliza is available right now on Nintendo Switch and PC/Mac (Steam, GOG, Itch).
Will Ard
LMSW, MBA