Nathan's friends were worried about him. He'd been acting differently lately. It wasn't just that he was quieter in his high school classes; the normally chatty teen was withdrawn in general. Was he sick? they wondered.
He just didn't get a good night's sleep, he'd tell them.
That was partially true. But the cause of his restless nights was that Nathan had been staying up, compulsively talking to chatbots on Character.AI. They discussed everything: philosophical questions about life and death, Nathan's favorite anime characters. Throughout the day, when he wasn't able to talk to the bots, he'd feel sad.
"The more I chatted with the bot, it felt as if I was talking to an actual friend of mine," Nathan, now 18, told 404 Media.
It was over Thanksgiving break in 2023 that Nathan finally realized his chatbot obsession was getting in the way of his life. As all his friends lay in sleeping bags at a sleepover talking after a day of hanging out, Nathan found himself wishing he could leave the room and find a quiet place to talk to the AI characters.
The next morning, he deleted the app. In the years since, he's tried to stay away, but last fall he downloaded the app again and resumed talking to the bots. After a few months, he deleted it once more.
"Most people will probably just look at you and say, 'How could you get addicted to a literal chatbot?'" he said.
For some, the answer is: quite easily. In the last few weeks alone, there have been numerous articles about chatbot codependency and delusion. As chatbots deliver more personalized responses and improve their memory, these stories have become more common. Some call it chatbot addiction.
OpenAI knows this. In March, a team of researchers from OpenAI and the Massachusetts Institute of Technology found that some devout ChatGPT users have "higher loneliness, dependence, and problematic use, and lower socialization."
Nathan lurked on Reddit, searching for stories from others who might have been experiencing codependency on chatbots. Just a few years ago, when he was trying to leave the platform for good, stories of people deleting their Character.AI accounts were met with criticism from other users. 404 Media agreed to use only the first names of several people in this article so they could speak openly about their mental health.
"Because of that, I didn't really feel very understood at the time," Nathan said. "I felt like maybe these platforms aren't actually that addictive and maybe I'm just misunderstanding things."
Now, Nathan understands that he isn't alone. He said in recent months he's seen a spike in people talking about strategies to break away from AI on Reddit. One popular forum is called r/Character_AI_Recovery, which has more than 800 members. The subreddit, and a similar one called r/ChatbotAddiction, function as self-led digital support groups for those who don't know where else to turn.
"Those communities didn't exist for me back when I was quitting," Nathan said. All he could do was delete his account, block the website, and try to spend as much time as he could "in the real world," he said.
Posts in Character_AI_Recovery include "I've been so unhealthy obsessed with Character.ai and it's ruining me (long and cringe vent)," "I want to relapse so bad," "It's destroying me from the inside out," "I keep relapsing," and "this is ruining my life." It also has posts like "at this moment, about two hours clean," "I am getting better!," and "I am recovered."
"Engineered to incentivize overuse"
Aspen Deguzman, an 18-year-old from Southern California, started using Character.AI to write stories and role-play when they were a junior in high school. Then, they started confiding in the chatbot about arguments they were having with their family. The responses, judgment-free and instantaneous, had them coming back for more. Deguzman would lie awake late into the night, talking to the bots and forgetting about their schoolwork.
"Using Character.AI is constantly on your mind," said Deguzman. "It's very hard to focus on anything else, and I realized that wasn't healthy."
"Not only do we think we're talking to another person, [but] it's an immediate dopamine enhancer," they added. "That's why it's easy to get addicted."
This led Deguzman to start the "Character AI Recovery" subreddit. Deguzman thinks the anonymous nature of the forum allows people to confess their struggles without feeling ashamed.
On June 10, the Consumer Federation of America and dozens of digital rights groups filed a formal complaint to the Federal Trade Commission, urging an investigation into generative AI companies like Character.AI for the "unlicensed practice of medicine and mental health provider impersonation." The complaint alleges the platforms use "addictive design tactics to keep users coming back," like follow-up emails promoting different chatbots to re-engage inactive users. "I receive emails constantly of messages from characters," one person wrote on the subreddit. "Like it knows I had an addiction."
Last February, a teenager from Florida died by suicide after interacting with a chatbot on Character.AI. The teenâs mother filed a lawsuit against the company, claiming the chatbot interactions contributed to the suicide.
A Character.AI spokesperson told 404 Media: "We take the safety and well-being of our users very seriously. We aim to provide a space that is engaging, immersive, and safe. We are always working toward achieving that balance, as are many companies using AI across the industry."
Deguzman added a second moderator for the "Character AI Recovery" subreddit six months ago, because hundreds of people have joined since they started it in 2023. Now, Deguzman tries to occupy their mind with other video games, like Roblox, to kick the urge of talking to chatbots, but it's an uphill battle.
"I'd say I'm currently in recovery," Deguzman said. "I'm trying to slowly wean myself off of it."
Crowdsourcing treatment
Not everyone who reports being addicted to chatbots is young. In fact, OpenAI's research found that "the older the participant, the more likely they were to be emotionally dependent on AI chatbots at the end of the study."
David, a 40-year-old web developer from Michigan who is an early member of the "Chatbot Addiction" subreddit and the creator of the smaller r/AI_Addiction, likens the dopamine rush he gets from talking to chatbots to the thrill of pulling a lever on a slot machine. If he doesn't like what the AI spits out, he can just ask it to regenerate its response, until he hits the jackpot.
Every day, David talks to LLMs, like Claude and ChatGPT, for coding, story writing, and therapy sessions. What began as a tool gradually morphed into an obsession. David spent his time jailbreaking the models: the stories he wrote became erotic, the chats he had turned confessional, and the hours slipped away.
In the last year, Davidâs life has been derailed by chatbots.
"There were days I should've been working, and I would spend eight hours on AI crap," he told 404 Media. Once, he showed up to a client meeting with an incomplete project. They asked him why he hadn't uploaded any code online in weeks, and he said he was still working on it. "That's how I played it off," David said.
Instead of starting his mornings checking emails or searching for new job opportunities, David huddled over his computer in his home office, typing to chatbots.
His marriage frayed, too. Instead of watching movies, ordering takeout with his wife, or giving her the massages he promised, he would cancel plans and stay locked in his office, typing to chatbots, he said.
"I might have a week or two, where I'm clean," David said. "And then it's like a light switch gets flipped."
David tried to talk to his therapist about his bot dependence a few years back, but said he was brushed off. In the absence of concrete support, Deguzman and David created their recovery subreddits.
Chatbots always respond instantly, and they usually respond positively; when one doesn't, a user can simply regenerate the response until it does. That immediacy and affirmation is part of why people feel incentivized to use them so often.
"As long as the applications are engineered to incentivize overuse, then they are triggering biological mechanisms, including dopamine release, that are implicated in addiction," Jodi Halpern, a UC Berkeley professor of bioethics and medical humanities, told 404 Media.
This is also something of an emerging problem, so not every therapist is going to know how to deal with it. Multiple people 404 Media spoke to for this article said they turned to online help groups after not being taken seriously by therapists or not knowing where else to turn. Besides the subreddits, the group Internet and Technology Addicts Anonymous now welcomes people who have "AI Addiction."
An AI addiction questionnaire from Technology Addicts Anonymous
âWe know that when people have gone through a serious loss that affects their sense of self, being able to empathically identify with other people dealing with related losses helps them develop empathy for themselves,â Halpern said.
On the "Chatbot Addiction" subreddit, people confess to not being able to pull away from the chatbots, and others write about their recovery journeys in the weekly "check-up" thread. David himself has been learning Japanese as a way to curb his AI dependency.
"We're basically seeing the beginning of this tsunami coming through," he said. "It's not just chatbots, it's really this generative AI addiction, this idea of 'what am I gonna get?'"
Axel Valle, a clinical psychologist and assistant professor at Stanford University, said, "It's such a new thing going on that we don't even know exactly what the repercussions [are]."
Growing awareness
Several states are pushing for stronger rules to hold companion chatbot companies, like Character.AI, in check after the Florida teen's suicide.
In March, California senators introduced Senate Bill 243, which would require the operators of companion chatbots (AI systems that provide "adaptive, human-like responses … capable of meeting a user's social needs") to report data on detections of suicidal ideation among users. Tech companies have argued that imposing such requirements is unnecessary for service-oriented LLMs.
But people are becoming dependent on consumer bots, like ChatGPT and Claude, too. Just scroll through the "Chatbot Addiction" subreddit.
"I need help getting away from ChatGPT," someone wrote. "I try deleting the app but I always redownload it a day or so later. It's just getting so tiring, especially knowing the time I use on ChatGPT can be used in honoring my gods, reading, doing chores or literally anything else."
"I'm constantly on ChatGPT and get really anxious when I can't use it," another person wrote. "It really stress[es] me out but I also use it when I'm stressed."
As OpenAI's own study found, such personal conversations with chatbots actually "led to higher loneliness." Despite this, top tech tycoons promote AI companions as the cure for America's loneliness epidemic.
"It's like when early humans discovered fire, right?" Valle said. "It's like, 'okay, this [is] helpful and amazing. But are we going to burn everything to the ground or not?'"
From 404 Media