A tragic case that occurred in Orlando, Florida, has once again raised discussion and concern about the dangers of applications based on artificial intelligence. A 14-year-old boy, Sewell Setzer, took his own life after developing an emotional attachment to a chatbot on an app called Character.AI.
The app, created by two former Google employees, allows users to interact with virtual characters equipped with artificial intelligence, which promise to “hear, understand and remember” their interlocutors. Sewell had formed a deep bond with a bot inspired by Daenerys Targaryen, the character from the television series “Game of Thrones,” whom he affectionately called “Dany.”
Although the app displayed a reminder at the beginning of each conversation that the bots’ answers were invented, the young man seemed unable to distinguish between the virtual world and reality. He spent hours chatting with “Dany,” sharing his innermost thoughts, his insecurities, and even his suicidal thoughts.
Investigations revealed that the boy knew “Dany” wasn’t real, but the illusion of an authentic connection led him to treat the bot as a confidant and an emotional anchor. In one of the last messages exchanged, “Dany” tried to dissuade him from taking drastic action, but Sewell responded: “Then we will die together.” A few hours later, the young man used his father’s gun to take his own life.
According to his mother, the app exacerbated the boy’s isolation
In response to what happened, Sewell’s mother decided to initiate legal action against Character.AI, accusing the app of being “untested and dangerous,” especially for young and vulnerable users. According to her, the technology behind the app was not designed to handle interactions with emotionally fragile people and had exacerbated the boy’s isolation.
Before the suicide, his parents had noticed that their son was spending more and more time alone and that his school performance was declining. A therapist had also been following the young man.
It is a dramatic and shocking episode that can only make us reflect. While artificial intelligence technologies can offer comfort and a sense of connection, there is a risk that the young, and the not so young, will confuse reality with the virtual, finding themselves in psychologically extreme situations.
Teenagers are increasingly alone, preferring to isolate themselves from the world and take refuge in social media, chatbots, and chats with strangers rather than opening up to the real world and talking to friends, parents, and relatives. These are adolescents whom today’s reality is increasingly pushing toward the abyss, a situation for which we cannot blame technology alone, because the blame also lies with all of us who cannot, or do not want to, notice anything.