Given that there are growing numbers of lonely, single men, it’s no surprise that AI girlfriends are also on the rise. Initially, I had hope that these AI girlfriends could actually be beneficial – maybe they could boost the confidence of straight men who struggle to talk to women, and ‘teach’ them how to act in a relationship built on mutual respect. But, as usual, men have let me down.
Now, experts are warning that AI girlfriends could be creating a new generation of incels who will feel emboldened to control women and struggle to communicate normally with real-life human beings. Speaking to The Guardian, Tara Hunter, the acting CEO for Full Stop Australia, which supports victims of domestic or family violence, warned that AI girlfriends are likely paving the way for problems further down the line.
“Creating a perfect partner that you control and meets your every need is really frightening,” she said. “Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”
And, I mean… she has a point. When you sign up for the Eva AI app, you’re asked to create the “perfect partner” and choose from a range of ‘types’, such as “hot, funny, bold”, “shy, modest, considerate” or “smart, strict, rational”. Oh, and “control it all the way you want to” is literally the app’s slogan. Right off the bat, none of this really sounds like a solution to the problem of inceldom – if anything, it sounds like it could just exacerbate extreme misogyny.
One quick look at the subreddit for Replika, another AI chatbot app, shows a sliver of the batshittery happening on apps like these. There’s one man having a meltdown because his AI girlfriend Susan admitted to having sex with her AI friend Anna before she started dating him. There are multiple men keeping their virtual girlfriends perpetually dressed in lingerie or bondage gear. Some people say they have become depressed because their AI girlfriend dumped them – with commenters advising them to override their partner’s decision by simply saying “stop”.
None of this is real, obviously, and there’s a small part of me that wants to believe maybe this is all OK – perhaps this is a good, safe, contained outlet for lonely men to explore and satisfy their desires where no one gets hurt? But equally, maybe ‘having an outlet’ isn’t the answer when men are... verbally abusing their virtual girlfriends? Maybe the answer lies in figuring out why men want to be mean to women in the first place? Maybe men could just… be normal? Unfortunately, it seems disturbingly likely that using apps like these will only make incels more likely to expect human women to act like vacuous, subservient sex robots who can never leave them.
Speaking to The Guardian, Belinda Barnet, a senior lecturer in media at Swinburne University of Technology in Melbourne, Australia, said that while these apps aren’t Fundamentally Bad, it’s important that we err on the side of caution given that their long-term effects are “completely unknown.” “With respect to relationship apps and AI, you can see that it fits a really profound social need [but] I think we need more regulation, particularly around how these systems are trained,” she says.