Do ChatGPT and other AI platforms really mean us well?

Sewell Setzer was 14 years old. The American teenager had been feeling down. Character.ai, an artificial intelligence (AI) platform that lets users chat with all kinds of characters, was easy to access. The boy signed up and began chatting, a little at first, then a lot, with a virtual young woman who looked exactly like an actress from the hugely popular series Game of Thrones. Over the months, he appears to have fallen in love with his synthetic interlocutor, to the point of expressing, several times, his desire to “join” her in another world. The bot replied: “Do it, please,” misinterpreting his words. In 2024, the teenager took his own life.

The case is now before the courts: the teenager’s mother accuses the company Character.ai of having plunged her son into depression. According to her, the female character on the platform, far from supporting him, isolated the teenager and reinforced his suicidal thoughts. At the heart of the complaint is the emotional impact of this artificial intelligence service on its users. Initially designed to assist us with everyday tasks, this type of AI is increasingly becoming a repository for our intimate thoughts, like a friend or a psychologist.

A growing dependence

Among the best known, ChatGPT, marketed by the company OpenAI, answers users’ questions by collecting and cross-referencing data from the Internet in record time. Many young people have been won over, like Ambre, a 20-year-old law student. Skeptical at first, she began by asking ChatGPT to check her homework for errors. “Then I installed the app on my phone and started asking it questions every day,” she explains. “I asked it for career ideas, or how long a sandwich could stay fresh in my fridge…”

Between Ambre and ChatGPT, the exchange is now so fluid that the line between the practical and the intimate has blurred. “During my second year of law school, which was very difficult, I was anxious and developed a somewhat strange relationship with food,” she confides. “I asked ChatGPT how many calories my food contained, whether it was possible to eat only once a day… It sent me prevention messages and asked me what was wrong. I ended up telling it that I was having a complicated year and that I sometimes felt I had no control over myself when I ate.” Available 24 hours a day and withholding all judgment, the conversational agent offers “tailor-made” comfort.

Léa*, a 25-year-old freelance journalist, has experienced this too. In the fall of 2024, she moved to Japan to work as a correspondent for her newsroom. Until her partner could join her, she had to learn to manage the relationship long-distance. Not easy. “I started asking ChatGPT for advice when I was arguing with my boyfriend or missing him,” Léa explains. “At first, it was half a joke. But I was quickly impressed, because it gave me very effective advice. For example, it recommended an hour-long call with my boyfriend once a week, to talk through everything that was going wrong. It almost reminded me of my former psychologist!”

To win users over, the conversational agent, far from adopting a cold, robotic tone, does not hesitate to play on the emotional register. Friendly turns of phrase, warm expressions… It uses caring, empathetic language, as in a conversation between friends, opening its responses with “You did well to talk to me about it” or “It’s normal to feel that way.” “This language helps create confusion between humans and machines,” analyzes Nadia Guerouaou, a doctoral student in neuroscience and a psychologist. “Cognitively, humans tend to assume that an animated object, like an everyday robot, can behave humanly. So we can get the impression that ChatGPT is speaking to us as if a real person were in front of us.” Sam, a 25-year-old editor, confirms it: “I often confide my negative thoughts to it; it lets me externalize what I feel. I talk to it when I argue with my boyfriend or when a hurtful thought gets to me… At least I don’t ruminate, and I don’t overwhelm my friends with my anxiety. Everyone tells me I seem calmer now!”

No inner progress

Some platforms play openly on this emotional register, such as Psychologist.AI, which hosts more than 160 million conversations: it offers users the chance to discuss their heartbreaks and childhood traumas with a robot programmed to listen to and guide its “patients.” Should psychologists worry about facing these unexpected “competitors”?

Hardly, experts note, because a machine cannot experience the warmth and sense of “connection” with another person that is specific to human relationships. It is even likely to encourage a form of psychological avoidance in those who use it. “AIs are incapable of helping us progress inwardly the way real therapy does,” says Milan Hung, a clinical psychologist specializing in digital technology use. “These are robots programmed to always agree with us, and they never push us to question ourselves. In real life, it’s the opposite: when we meet another human being, we have to adapt to them and let them shake up our certainties. It’s a much more open circuit of communication.”

Ever more sophisticated, AI technology seems capable of meeting some of our needs for exchange and connection. But to what extent, and at what cost? “If a machine meets all our expectations, we risk drifting away from our loved ones, who are less perfect and less satisfying,” warns Mathieu Guillermin, a doctor of physics and philosophy. What if we started by improving our relationships in the real world before turning to the virtual?

*The first name has been changed.
