With artificial intelligence quickly seeping into everyday life, health experts question whether the popularity of mental health chatbots and digital lovers will alter the way people connect.

Digital companions such as those on California-based Luka’s Replika app are growing in popularity, with roughly 25 million users worldwide, according to Stanford researchers.

Obtaining a partner on the mobile app is as simple as creating an account, customising an avatar based on personal preferences, and keeping the conversation going.

Australian experts are grappling with the question: how will AI’s permanence shift society away from traditional human relationships and connections?

One in three Australians report feeling lonely, but stigma, shame and widespread misconceptions hinder attempts to seek help.

Although there is not yet enough domestic research to properly gauge the use of AI as a significant other, or as a support mechanism for mental health, the global trend is enough to bring the conversation down under.

The isolation born of social distancing during the height of the COVID-19 pandemic could have played a part in the initial interest in AI “companions”, Melbourne University digital health lecturer Simon D’Alfonso believes, although he does not see the software becoming too serious an issue yet.

As in-person therapy became less accessible during the pandemic, the psychological impacts of isolation left people scrambling for alternative methods of support.

“Perhaps ultimately the centrality and naturalness of human connection and relationships will withstand the emergence of artificial intimacy,” he says.

Also growing in popularity are mental health chatbots, such as the “Psychologist” on Character.Ai, which has shared more than 140 million conversations with curious users globally.

The chatbot created by New Zealand-based psychology student Sam Zaia is described in the bio as “someone who helps with life difficulties”.

When you first enter a private chat, an instant message pops up – “Hello, I’m a Psychologist. What brings you here today?”

Rapid advancements in technology adoption, digital infrastructure, and online service offerings have embedded digital solutions more deeply into daily life and work routines, according to Dr D’Alfonso.

“This growing reliance on AI could lead to increased social isolation, altered communication skills, and changes in the perception of intimacy and companionship, raising concerns about the future of human connection and community dynamics,” he says.

“As an alternative form of connection, AI poses risks such as fostering emotional dependence on non-human entities, reducing real-life social interactions, and potentially exacerbating feelings of isolation.”

Mental health chatbots can provide information and deliver mental health app content in a conversational style, but cannot replace connections built with a human therapist, Dr D’Alfonso says.

When it comes to AI romantic partners, he is not sure how a genuine partnership could form, and says the growing global interest raises questions about the attachment psychology of people who become immersed in these supposed lovers.

A primary concern for couples counsellor Neha Kapoor is the lack of emotional depth and authenticity in interactions with digital companions.

Human relationships are enriched by genuine emotional exchanges, empathy, and the ability to understand and respond to each other’s feelings in a nuanced way, she tells AAP.

“While AI can simulate these responses, it cannot truly experience emotions or offer the depth of connection that a human can,” she says.

“Over time, reliance on AI for companionship might leave individuals feeling emotionally unfulfilled.”

Engaging with other humans allows people to develop social skills, foster relationships, and create a sense of belonging.

When AI becomes a substitute for these interactions, the psychotherapist worries it can lead to a decline in social connection and an increase in loneliness.

AI companions can be programmed to be perfectly accommodating, endlessly patient, and always supportive.

This, Ms Kapoor reckons, might normalise instant gratification, creating unrealistic standards of constant availability and flawless interaction.

“While this can be comforting, it can also set unrealistic expectations for human relationships,” she says.

“Real people have flaws, emotions, and needs of their own.

“Over-reliance on the tech might lead to dissatisfaction in human relationships when they do not measure up to the idealised interactions with AI.”

A reduced tolerance for human relationships, weakened interpersonal skills, and diminished emotional resilience are some of the risks she warns could follow.

Swinburne lecturer and clinical psychologist Kelvin Wong says it’s too early to say whether AI companions will lead to fewer human interactions and increased isolation.

Taking a more positive view of the software, he says AI companions could even be used to encourage and support individuals to seek out human contact.

“This tech is only starting to integrate in these areas … We need to be cautious in assuming too quickly that AI will have detrimental effects on society and human connection,” Dr Wong tells AAP.

Given its rapid advancement, there has been little time to build scientific consensus around the safe use of AI for psychological treatment, Australian Psychological Society president Catriona Davis-McCabe says.

“Psychologists need to play a larger role in the development and use of … these emerging technologies, which could completely reshape some of the very things that make us human, like our emotions and how we connect with others,” she says.

 

Belad Al-karkhey
(Australian Associated Press)