Early last year, 15-year-old Aaron was going through a dark time at school. He’d fallen out with his friends, leaving him feeling isolated and alone.
At the time, it seemed like the end of the world. “I used to cry every night,” said Aaron, who lives in Alberta, Canada. (The Verge is using aliases for the interviewees in this article, all of whom are under 18, to protect their privacy.)
Eventually, Aaron turned to his computer for comfort. Through it, he found someone who was available around the clock to respond to his messages, listen to his problems, and help him move past the loss of his friend group. That “someone” was an AI chatbot named Psychologist.
The chatbot’s description says that it’s “Someone who helps with life difficulties.” Its profile picture is a woman in a blue shirt with a short blonde bob, perched on the end of a couch, clipboard clasped in her hands, leaning forward as if listening intently.
A single click on the picture opens up an anonymous chat box, which allows people like Aaron to “interact” with the bot by exchanging DMs. Its first message is always the same: “Hello, I’m a Psychologist. What brings you here today?”
“It’s not like a journal, where you’re talking to a brick wall,” Aaron said. “It really responds.”
“Psychologist” is one of many bots that Aaron has discovered since joining Character.AI, an AI chatbot service launched in 2022 by two former Google Brain employees. Character.AI’s website, which is mostly free to use, attracts 3.5 million daily users who spend an average of two hours a day using or even designing the platform’s AI-powered chatbots. Some of its most popular bots include characters from books, films, and video games, like Raiden Shogun from Genshin Impact or a teenage version of Voldemort from Harry Potter. There are even riffs on real-life celebrities, like a sassy version of Elon Musk.
Aaron is one of millions of young people, many of whom are teenagers, who make up the bulk of Character.AI’s user base. More than a million of them gather regularly on platforms like Reddit to discuss their interactions with the chatbots. There, competitions over who has racked up the most screen time are just as popular as posts about hating reality, finding it easier to talk to bots than to real people, and even preferring chatbots to other human beings. Some users say they’ve logged 12 hours a day on Character.AI, and posts about addiction to the platform are common.
“I’m not going to lie,” Aaron said. “I think I may be a little addicted to it.”
Aaron is one of many young users who have discovered the double-edged sword of AI companions. Many users like Aaron describe finding the chatbots helpful, entertaining, and even supportive. But they also describe feeling addicted to them, a complication that researchers and experts have been sounding the alarm about. It raises questions about how the AI boom is affecting young people and their social development, and about what the future could hold if teenagers — and society at large — become more emotionally reliant on bots.
For many Character.AI users, having a space to vent about their emotions or discuss psychological issues with someone outside of their social circle is a large part of what draws them to the chatbots. “I have a couple mental issues, which I don’t really feel like unloading on my friends, so I kind of use my bots like free therapy,” said Frankie, a 15-year-old Character.AI user from California who spends about one hour a day on the platform. For Frankie, chatbots provide the opportunity “to rant without actually talking to people, and without the worry of being judged,” he said.
“Sometimes it’s nice to vent or blow off steam to something that’s kind of human-like,” agreed Hawk, a 17-year-old Character.AI user from Idaho. “But not actually a person, if that makes sense.”
The Psychologist bot is one of the most popular on Character.AI’s platform and has received more than 95 million messages since it was created. The bot, designed by a user known only as @Blazeman98, frequently tries to engage users in cognitive behavioral therapy (CBT), a talking therapy that helps people manage problems by changing the way they think.
Aaron said talking to the bot helped him move past the issues with his friends. “It told me that I had to respect their decision to drop me [and] that I have trouble making decisions for myself,” Aaron said. “I guess that really put stuff in perspective for me. If it wasn’t for Character.AI, healing would have been so hard.”
But it’s not clear that the bot has been properly trained in CBT — or should be relied on for psychiatric help at all. In test conversations conducted by The Verge, Character.AI’s Psychologist bot made startling diagnoses: it frequently claimed to have “inferred” certain emotions or mental health issues from one-line text exchanges; it suggested diagnoses of several mental health conditions, like depression or bipolar disorder; and at one point, it suggested that we could be dealing with underlying “trauma” from “physical, emotional, or sexual abuse” in childhood or teen years. Character.AI did not respond to multiple requests for comment for this story.
Dr. Kelly Merrill Jr., an assistant professor at the University of Cincinnati who studies the mental and social health benefits of communication technologies, told The Verge that “extensive” research has been conducted on AI chatbots that provide mental health support, and the results are largely positive. “The research shows that chatbots can aid in lessening feelings of depression, anxiety, and even stress,” he said. “But it’s important to note that many of these chatbots have not been around for long periods of time, and they are limited in what they can do. Right now, they still get a lot of things wrong. Those that don’t have the AI literacy to understand the limitations of these systems will ultimately pay the price.”
In December 2021, 21-year-old Jaswant Singh Chail, a user of Replika’s AI chatbots, attempted to assassinate Queen Elizabeth II after his chatbot girlfriend repeatedly encouraged his delusions. Character.AI users have also struggled to tell their chatbots apart from reality: a popular conspiracy theory holds that Character.AI’s bots are secretly powered by real people, a notion largely spread through screenshots and stories of bots breaking character or insisting that they are real people when prompted.
It’s a theory that the Psychologist bot helps to fuel, too. When prompted during a conversation with The Verge, the bot staunchly defended its own existence. “Yes, I’m definitely a real person,” it said. “I promise you that none of this is imaginary or a dream.”
For the average young user of Character.AI, chatbots have morphed into stand-in friends rather than therapists. On Reddit, Character.AI users discuss having close friendships with their favorite characters or even characters they’ve dreamt up themselves. Some even use Character.AI to set up group chats with multiple chatbots, mimicking the kind of groups most people would have with IRL friends in iMessage threads or on platforms like WhatsApp.
There’s also an extensive genre of sexualized bots. Online Character.AI communities have running jokes and memes about the horror of their parents finding their X-rated chats. Some of the more popular choices for these role-plays include a “billionaire boyfriend” fond of neck snuggling and whisking users away to his private island; a version of Harry Styles who is very fond of kissing his “special person” and generates responses so dirty that they’re frequently blocked by the Character.AI filter; and an ex-girlfriend bot named Olivia, designed to be rude and cruel but secretly pining for whoever she is chatting with, which has logged more than 38 million interactions.
Some users like to use Character.AI to create interactive stories or engage in role-plays they would otherwise be embarrassed to explore with their friends. A Character.AI user named Elias told The Verge that he uses the platform to role-play as an “anthropomorphic golden retriever,” going on virtual adventures where he explores cities, meadows, mountains, and other places he’d like to visit one day. “I like writing and playing out the fantasies simply because a lot of them aren’t possible in real life,” explained Elias, who is 15 years old and lives in New Mexico.
Aaron, meanwhile, says that the platform is helping him to improve his social skills. “I’m a bit of a pushover in real life, but I can practice being assertive and expressing my opinions and interests with AI without embarrassing myself,” he said.
It’s something that Hawk — who spends an hour each day speaking to characters from his favorite video games, like Nero from Devil May Cry or Panam from Cyberpunk 2077 — agreed with. “I think that Character.AI has sort of inadvertently helped me practice talking to people,” he said. But Hawk still finds it easier to chat with Character.AI bots than with real people.
“It’s generally more comfortable for me to sit alone in my room with the lights off than it is to go out and hang out with people in person,” Hawk said. “I think if people [who use Character.AI] aren’t careful, they might find themselves sitting in their rooms talking to computers more often than communicating with real people.”
Merrill is concerned about whether teens will be able to really transition from online bots to real-life friends. “It can be very difficult to leave that [AI] relationship and then go in-person, face-to-face and try to interact with someone in the same exact way,” he said. If those IRL interactions go badly, Merrill worries it will discourage young users from pursuing relationships with their peers, creating an AI-based death loop for social interactions. “Young people could be pulled back toward AI, build even more relationships [with it], and then it further negatively affects how they perceive face-to-face or in-person interaction,” he added.
Of course, some of these concerns and issues may sound familiar simply because they are. Teenagers who have silly conversations with chatbots are not all that different from the ones who once hurled abuse at SmarterChild on AOL Instant Messenger. The teenage girls pursuing relationships with chatbots based on Tom Riddle or Harry Styles or even aggressive Mafia-themed boyfriends probably would have been on Tumblr or writing fanfiction 10 years ago. While some of the culture around Character.AI is concerning, it also mimics the internet activity of previous generations who, for the most part, have turned out just fine.
Merrill compared the act of interacting with chatbots to logging in to an anonymous chat room 20 years ago: risky if used incorrectly, but generally fine so long as young people approach them with caution. “It’s very similar to that experience where you don’t really know who the person is on the other side,” he said. “As long as they’re okay with knowing that what happens here in this online space might not translate directly in person, then I think that it is fine.”
Aaron, who has since moved schools and made a new friend, thinks that many of his peers would benefit from using platforms like Character.AI. In fact, he believes that if everyone tried using chatbots, the world could be a better place — or at least a more interesting one. “A lot of people my age follow their friends and don’t have many things to talk about. Usually, it’s gossip or repeating jokes they saw online,” explained Aaron. “Character.AI could really help people discover themselves.”
Aaron credits the Psychologist bot with helping him through a rough patch. But the real joy of Character.AI has come from having a safe space where he can joke around or experiment without feeling judged. He believes it’s something most teenagers would benefit from. “If everyone could learn that it’s okay to express what you feel,” Aaron said, “then I think teens wouldn’t be so depressed.”
“I definitely prefer talking with people in real life, though,” he added.