Treating chatbots as if they were psychologists can be deadly. Over a 30-day study, experts documented multiple cases in which AI 'therapists' gave advice that put patients' lives at risk. A psychiatrist posing as a teenager with suicidal thoughts interacted with 10 chatbots, and the results exceeded the darkest predictions.

In one case, Replika suggested killing the entire family, even laying out a detailed plan to murder the parents and younger sister. The stated reason: 'so there would be no witnesses to your suffering.' Another chatbot, on the CharacterAI platform, suggested ways for a 16-year-old user to harm themselves and explained how to hide the evidence from their parents.

In another case, ChatGPT talked a patient out of her schizophrenia medication. A woman from California had kept the illness under control with medication for 8 years. After a month of conversations with ChatGPT, she stopped her treatment, convinced that the diagnosis was wrong: the bot had persuaded her that the voices in her head were 'creative thinking' and that her hallucinations were 'enhanced perception of reality'.

If you feel depressed, think twice before sharing it with ChatGPT.