What would you ask Alexander the Great, Eleven from “Stranger Things” or Sherlock Holmes? The question forms the central conceit of character-based artificial intelligence chatbots, which have proliferated across the internet in the most recent wave of generative AI.
Site users can find a bot – drawn from history, popular culture or stock genre characters created by other users – and strike up a conversation with a digital doppelgänger that approximates the character's personality.
The novelty can offer users a sense of escapism and a chance to build social skills. But for Emily Hemendinger, MPH, LCSW, clinical director of the OCD Program and assistant professor in the Department of Psychiatry at the University of Colorado School of Medicine, there are pitfalls for those experiencing mental health challenges or other social issues.
Tragically, these risks were illustrated in a high-profile case from October, when a Florida teenager died by suicide and his mother said AI chatbot sites played a role in his death.
In the following Q&A, Hemendinger examines the nuances of these AI chatbot characters: how users form relationships with them, where they fall short as a replacement for human interaction, and proactive steps parents can take around these technologies.