Chatbots: synthetic intimacy
At three in the morning, a teenager can't stop thinking about a stranger's comment on social media. She would like to ask her best friend for advice, but she doesn't want to wake her. So she picks up her phone and starts whispering. At eleven at night, another teenager is doing homework when a playground argument comes back to him, swirling around in his head. He is shaken; he wants to text his best friend for advice, but he's embarrassed. He too picks up his phone and starts whispering. Both opt for a synthetic confidant who will attend to them at odd hours and won't judge them. A confidant designed to generate emotional dependency.

In two decades, the digital economy has evolved enormously: it began with data, and then, with social media platforms, we discovered that attention was the scarce resource to compete for. Today, with the arrival of chatbots, what is being monetized is our need to connect. Emotional support is one of the three most common uses of chatbots, among adolescents and adults alike. The reason is that they promise availability, immediacy, and constant validation, generating a sense of complicity built on synthetic intimacy, otherwise known as the attachment economy.
Faced with this reality, one of the emerging fears is that these conversational interfaces could become a substitute. That is, if we turn to them because we feel lonely, will we end up lonelier because it becomes easier to relate to programs than to people?
A recent study indicates that these interactions help alleviate discomfort in the moment, but the improvement is temporary. The experiment was conducted in Canada with 300 university students. It is relevant that they were first-year students, since life-stage transitions can bring loneliness or require time to build connections in a new community. The students were divided into three groups and asked to participate for two weeks. The first was the control group, which was simply monitored: at the end of each day, they had to write a diary entry on the Discord platform, explaining how they felt. The second group interacted with Sam, a chatbot based on psychological support techniques. The third group talked with another participant in the experiment, assigned at random, so it was neither a chosen relationship nor necessarily a deep friendship. The striking result is that all groups experienced some reduction in loneliness, but human interaction, even with a stranger, produced the best results.
The first conclusion is that the control group also registered lower levels of loneliness at the end of the period, even though no specific intervention had been applied beyond a daily message reminding them to fill out their personal diary. The second takeaway is that conversing with Sam for two weeks had effects similar to those of simply keeping a diary. Pouring out one's worries is already a way to empty the mind, order emotions, and explore oneself. However, this space of solitude and self-observation is necessary but not sufficient. We still need the other person. In short, no matter how well trained Sam is, no algorithm can replace the encounter between two truly human fragilities.