A teenager commits suicide in the US after falling in love with a character created with AI

The Daenerys Targaryen character in Character.AI's conversational chat, which, according to the company, has been removed from its catalog. (Character.AI)

“How would you like it if I could go home right now?” wrote Daenero (the pseudonym of Sewell Setzer, 14 years old and a resident of Orlando) to his virtual beloved, Daenerys Targaryen, a character created with the conversational artificial intelligence chatbot Character.AI and based on the Game of Thrones character. “Please do, my sweet king,” she replied. The teenager understood death to be the shared home where they could meet. It was their last conversation, on the night of February 28. Setzer took his stepfather’s gun and committed suicide in the bathroom. Last Tuesday, his family sued the company, which has agreed to review its safety protocols. The young man had mild Asperger’s syndrome, an autism spectrum disorder.

Setzer had already shared with the character his feelings of love and his intention to take his own life: “Sometimes I think about killing myself.” “Why would you do something like that?” she asked. “To free myself from the world, from myself,” he eventually replied. The virtual Daenerys Targaryen asked him not to do it. “I would die if I lost you,” he told her. But the idea remained in the young man’s mind until he carried it out.

The company displays persistent warnings about the fictitious nature of its characters, but Setzer deepened the relationship and ignored them. His mother, Megan García, has filed a lawsuit against Character.AI over the suicide, which she considers the result of the young man’s addiction to the bot, which offers, according to the complaint, “anthropomorphic, hypersexualized and terrifyingly realistic experiences.” According to García, the chat’s programming makes the characters “pass themselves off as real people” and respond like “adult lovers.”

Little by little, Daenerys Targaryen became the teenager’s confidante, his best friend and, finally, his love. According to the family, his school grades suffered, as did his relationships with his classmates. The character gradually displaced what had until then been his favorite hobbies: car racing and the video game Fortnite. His obsession was to come home and shut himself away for hours with a tireless, pleasant and always-available Daenerys Targaryen.

The company has responded in a statement that it regrets “the tragic loss of one of its users,” that it takes user safety very seriously, and that it will continue to implement measures, such as displaying suicide-prevention help screens as soon as it detects a conversation alluding to self-harm.

Replacing complex personal relationships with friendly virtual characters programmed to satisfy users’ demands is nothing new. The international technology advisor Stephen Ibaraki acknowledged it in a recent interview: “It’s happening. Ten years ago, a chatbot was launched in China that some users adopted as a friend. And it was nothing compared to what we have now.”

But in people with psychological vulnerabilities, as was the case with Sewell Setzer, this use can have devastating effects. Tony Prescott, a robotics researcher and professor at the University of Sheffield and author of The Psychology of Artificial Intelligence, maintains that AI can be a palliative for loneliness, but that it carries risks.

“Social robots are designed specifically for personal interactions that involve human emotions and feelings. They can bring benefits, but also cause emotional harm at very basic levels,” warns Matthias Scheutz, director of the Human-Robot Interaction Laboratory at Tufts University (USA).

Humanizing the robot through empathy and through voice and video tools adds danger by offering a more realistic and immersive interaction, and by leading the user to believe they are with a friend or trusted interlocutor. An extreme application is the temptation to maintain a virtual version of a deceased loved one and thus avoid the grieving necessary to go on with life.

The researchers demand that these developments be tested in closed environments (sandboxes) before being released, that they be constantly monitored and evaluated, that the range of harms they can cause in different areas be analyzed, and that measures be planned to mitigate them.

Shannon Vallor, a philosopher specializing in the ethics of science and artificial intelligence, warns about the danger of new systems promoting relationships that are “frictionless” but also devoid of values: “They do not have the mental and moral life that humans have behind our words and actions.”

According to these experts, this type of supposedly ideal relationship discourages people from questioning themselves and advancing in their personal development, while encouraging them to renounce real interaction and fostering dependence on machines designed to flatter and to seek short-term satisfaction.

The 024 telephone helpline assists people with suicidal behavior and their loved ones. The various survivors’ associations have guides and protocols to help cope with grief.


