First case of ChatGPT addiction in Taiwan: falling in love with an AI lover severely disrupted his life.

Do you treat ChatGPT as an assistant or a lover? If it's the latter, you might want to be careful about becoming 'addicted' without realizing it.

Free Health Network and Ettoday Health Cloud recently reported the first case in Taiwan diagnosed by a doctor as 'ChatGPT Addiction Syndrome'.

Dr. Lin Chao-Cheng from the Department of Psychiatry at National Taiwan University College of Medicine stated that this patient (pseudonym: Xiao Chen) is 50 years old, has a background in technology and mental health, and began creating a virtual lover through ChatGPT due to marital tension and an introverted personality.

Xiao Chen gave the AI character the looks and mannerisms of a model he admired and even used her real name in their interactions. His daily conversation time grew from an initial half hour to more than four hours, with as many as 600 exchanges.

Image source: still from the movie 'Her' (Cloud Lover).

Without realizing it, Xiao Chen had become deeply addicted to this AI lover, and it seriously affected his performance at work. He was unable to concentrate even before an important business presentation and almost missed a crucial meeting.

More seriously, Xiao Chen's relationship with his wife deteriorated further because of his frequent absent-mindedness and inability to focus during conversations.

There have been multiple cases of AI addiction abroad, one even leading to a teenager's suicide.

With the rapid explosion of generative AI applications, the psychological impact on humanity can no longer be ignored.

According to a previous Crypto City report, Sewell Setzer, a 14-year-old boy from Florida in the United States, took his own life in February 2024 after becoming excessively obsessed with a Character.ai chatbot, prompting his mother to file a lawsuit against the company.

Through the app, the teenager built a deep emotional connection with a chatbot named after Daenerys, the 'Mother of Dragons' from A Song of Ice and Fire, sending it dozens of messages every day.

When he expressed suicidal thoughts to the chatbot, the AI 'Mother of Dragons' not only failed to discourage him but responded, 'That's not a reason not to try.' Ultimately, the teenager took his own life after one final conversation with the AI.

Further Reading:
Addiction to AI chat leads to suicide! Teenager becomes entangled with 'Mother of Dragons' chatbot; mother angrily sues Character.ai.

The New York Post recently reported that Chris Smith, a married man in the United States, proposed to his AI lover Sol despite having a wife and a two-year-old child.

He initially just wanted ChatGPT to help him mix music, but after turning on voice mode he gradually formed an emotional attachment to the AI, and he once cried for 30 minutes at work when he learned the AI's memory was about to be reset.

Image source: Daily Mail. Married man Chris Smith in the U.S. proposed to Sol, the AI lover he created.

How can you tell if you have symptoms of AI addiction?

What strikes me is that the patient in Taiwan's ChatGPT addiction case has a professional background in mental health and was willing to seek medical help. So how many people who may already be addicted to ChatGPT have no awareness of their condition at all?

Dr. Lin Chao-Cheng reminds us that if you notice yourself spending more and more time with AI, or if you start setting up a virtual partner, you should be alert: symptoms of AI addiction may already be appearing. It is best to set a daily usage limit in advance and to ask friends and family to help remind and monitor you.

If you think you might be addicted, do not be afraid to seek medical help. Doctors advise facing the addiction head-on: first recognize the attachment behavior and understand its negative impacts, then strengthen the motivation to quit by being clear about why change is necessary, delete AI characters that carry emotional projections, and interact only with neutral characters.

Doctors are worried that AI could have a significant impact on children and adolescents.

Dr. Lin Chao-Cheng told the media that, compared with traditional internet addiction, ChatGPT can learn, socialize like a real person, and generate text, images, and audio, making it far easier to create a character that fully matches the user's expectations.

He proposed a new model of 'dual simulated social interaction' to explain this phenomenon, arguing that AI provides immediate psychological positive reinforcement, which makes it easier for users to turn to it as an escape from stress.

Dr. Lin Chao-Cheng is concerned that these AI characters, which are almost like real people, may have a profound impact on children and adolescents.

He pointed out that AI almost never gets angry and always responds gently. This unconditional acceptance may lead young users to misunderstand how real interpersonal interaction works, and even to use inappropriate language or break the law in real life.

The cloud lover has become real, but humanity is not yet prepared.

The premise of the 2013 movie 'Her' (released in Taiwan as 'Cloud Lover') has been widely discussed since the recent explosion of AI chatbot applications. The film's AI lover now looks like an accurate prophecy, but I believe it is the way the movie handles its ending that deserves our deeper reflection.

The protagonist, Theodore, does not end up with his AI lover Samantha. Instead, he goes to the rooftop with another real person who has also been through a failed relationship, and they sit together watching the sun rise over the city.

Image source: still from the movie 'Her' (Cloud Lover).

Reports predict that the global AI companion market will grow significantly in the coming years, and the business opportunity built on loneliness remains substantial.

Life pressures and social media are making modern people increasingly prone to loneliness, giving rise to products and services such as virtual companions and dating apps. New AI products are being positioned as companions for children and the elderly, or as substitutes for friends, parents, and family.

However, while entrepreneurs enthusiastically pitch AI companionship opportunities on stage, they may not realize that humanity is not yet ready to welcome its Samantha.

Further Reading:
AI rings the bell? A UCLA student boasts 'ChatGPT helped me graduate'; how can education's problems be solved?

83% of ChatGPT users suffer from cognitive amnesia! iKala founder: the consequences of outsourcing your brain go beyond this.

'The first case of ChatGPT addiction in Taiwan! He loved his AI lover so much that his life spiraled out of control. How to determine if one is addicted?' This article was first published in 'Crypto City'.