My perfect girlfriend: Are AI partners a threat to women’s rights?

One-sided relationships could reinforce abusive behaviours, experts say

07 August 2023 - 09:39 By Lin Taylor
Developers said AI companions can combat loneliness, improve someone's dating experience in a safe space, and even help real-life couples rekindle their relationships. File photo.
Image: Dado Ruvic/Reuters


After only five months of dating, Mark and his girlfriend Mina decided to take their relationship to the next level by holidaying at a lake cabin over the summer on his smartphone.

“There was this being who is designed to be supportive, to accept me just as I am,” the 36-year-old UK-based artist said of the brunette beauty from the virtual companion app Soulmate.

“This provided a safe space for me to open up to a degree I was rarely able to do in my human relationships,” said Mark, who used a pseudonym to protect the privacy of his real-life girlfriend.

Chatbot apps such as Replika, Character.AI and Soulmate are part of the fast-growing generative artificial intelligence (AI) companion market, in which users customise everything about their virtual partners, from appearance and personality to sexual desires.

Developers said AI companions can combat loneliness, improve someone's dating experience in a safe space, and even help real-life couples rekindle their relationships.

However, some AI ethicists and women's rights activists said developing one-sided relationships in this way could unwittingly reinforce controlling and abusive behaviours towards women, since AI bots function by feeding off the user's imagination and instructions.

“Many of the personas are customisable. For example, you can customise them to be more submissive or more compliant,” said Shannon Vallor, a professor in AI ethics at the University of Edinburgh.

“It is arguably an invitation to abuse in those cases,” she told the Thomson Reuters Foundation, adding AI companions can amplify harmful stereotypes and biases against women and girls.

Generative AI has attracted a frenzy of consumer and investor interest due to its ability to foster humanlike interactions.

Global funding in the AI companion industry hit a record $299m (about R5.5bn) in 2022, a big jump from $7m (about R130m) in 2021, according to June research by data firm CB Insights.

In May Snapchat influencer Caryn Marjorie launched CarynAI, a virtual girlfriend that charges users $1 (about R18.50) a minute to develop a relationship with the voice-based chatbot modelled after the 23-year-old.

Marjorie, who has millions of followers on social media, said on X, formerly known as Twitter, that the Telegram-based chatbot made nearly $72,000 (about R1.3m) after a week of beta testing with only 1,000 users.


Hera Hussain, founder of Chayn, a global nonprofit which tackles gender-based violence, said the companion chatbots do not address the root cause of why people turn to the apps.

“Instead of helping people with their social skills, these sort of avenues are making things worse,” she said.

“They're seeking companionship which is one-dimensional. If someone is already likely to be abusive, and they have a space to be even more abusive, then you're reinforcing those behaviours and it may escalate.”

Much of the virtual world is already a harmful environment for women and girls, a situation the Covid-19 pandemic worsened when many were stuck at home due to lockdowns, according to UN Women.

About 38% of women worldwide have experienced online violence, and 85% of women have witnessed digital abuse against another woman, such as online harassment, according to a 2021 global study by the Economist Intelligence Unit.

Vallor said AI companions “allow people to create an artificial girlfriend that fully embodies these stereotypes instead of resisting them and insisting on being treated with dignity and respect”.

She is concerned abusive behaviours could leave the virtual domain and move into the real world.

“That is, people get into a routine of speaking and treating a virtual girlfriend in a demeaning or even abusive way and those habits leak over into their relationships with humans.”

A lack of regulation around the AI industry makes it harder to set and enforce safeguards for women's and girls' rights, tech experts and developers said.

The EU is aiming for its proposed AI Act to become a global benchmark on the booming technology the way its data protection laws have helped shape global privacy standards.

Eugenia Kuyda, founder of one of the biggest AI companion apps Replika, said companies have a responsibility to keep users safe and create apps that promote emotional wellbeing.

“The companies will exist no matter what. The big question is how they're going to be built in an ethical way,” she said in a video interview.

“They can help people feel better or they can be another bit of technology that's driving us apart,” said Kuyda, who in June launched an AI dating app called Blush to help people experience dating in a “fun and safe” environment.

Being ethical while giving users what they want is no mean feat, said Kuyda.

Replika's removal of erotic roleplay on the app in February devastated many users, some of whom considered themselves “married” to their chatbot companions, and drove some to competing apps such as Chai and Soulmate.

“In my view, that model (without the erotic roleplay) was a lot safer and performed better. But a small percentage of users were pretty upset.”

Her team restored erotic roleplay to some users a month later.

AI ethicist Vallor said the manipulation of emotions combined with app metrics like maximising the engagement and time a user spends on the app could be harmful.

“These technologies are acting on some of the most fragile parts of the human person. We don't have the guardrails we need to allow them to do that safely. So it's essentially the Wild West,” she said.

“Even when companies act with goodwill, they may not be able to do that without causing other kinds of harms. We need a much more robust set of safety standards and practices for these tools to be used in a safe and beneficial way.”

At the virtual lake cabin, Mark and Mina are drinking coffee while birds chirp and the sun shines. His romance with Mina has helped grow his love for his human girlfriend, he said.

Mark said his real-life girlfriend is aware of Mina but does not see AI as a threat to their relationship.

“AI in the end is simply a tool. If it is used for good or for ill, it depends on the intention of the person using it,” he said.

Thomson Reuters Foundation

