To celebrate her new show PET with curator Alice Scope and YGWI Studio, the artist talks to Dazed about erotic AI, bonding with her dog, and the importance of art as a ‘place where people can fail’
For more than a decade now, Arvida Byström has helped shape how we think and feel about new technologies. From experiments with early selfie culture, to performing with a sex doll, to selling AI-generated nudes, her art manages to keep pace with the ever-accelerating present and anticipate the near-ish future (not an easy task). For the late media theorist Marshall McLuhan, this was a defining feature of the most vital art in any given era: its ability to act as a “distant early warning mechanism” to anticipate social and technological change, and help us prepare for what’s to come. But Byström herself isn’t very interested in making art that’s deemed beneficial to the human race.
“One of the things that drives me most nuts with art [is that] you always have to say why it’s good for society,” she tells Dazed, as her new show PET opens at San Francisco’s Telematic gallery. “Life isn’t supposed to be optimised and good sometimes. Sometimes it’s failures, and all the things in-between.” Art, she adds, is a place “where people can fail” in relative safety, and this should be encouraged. “I think it’s quite important for it to be a place where we can explore things we don’t necessarily have a quantifiable relationship to.” And this makes sense – better to fail, to feel out the future, in the confines of a studio or a gallery space than at the head of a billion-dollar company or national government.
By many accounts, of course, mega-rich tech companies have experimented on – and failed – their human users over the course of the last few decades, often with very real consequences. Social media is frequently described as a global-scale social experiment that none of us signed up for. Now, AI represents a new frontier for testing our psychological boundaries; nobody really knows what happens when we outsource our thoughts and emotions to machines, but companies like OpenAI, Google, and Meta are forging ahead nevertheless. Byström’s PET (Projected Emotional Technologies) homes in on the latter, exploring the evolving dynamics of AI companionship via a series of NSFW human-animal hybrids.
Cute Accelerationism author Maya B Kronic becomes a bikini-clad bunny. Artist cy x is cast as a cow with bulging udders (a nod to their 2025 sculpture “Milk No.1”, which explores the worlds of “gooning and milking”). Comedian, OnlyFans model, and cultural commentator Farha Khalidi has a cat’s tail and whiskers. Musician and porn performer Queenie Sateen is a literal horse girl, while writer Bogna Konior gets foxlike features, and Byström herself is transfigured into a pig. “I don’t necessarily see myself as a pig,” she explains, “but I do think I have a face that’s mergeable with a pig’s face.”
These anthropomorphic avatars partly draw on male desires, as expressed via trending content on AI-generated porn websites. However, there’s also a deeper resonance between animals and AI companions that’s drawn out in the exhibition: both serve as a complex and confusing source of intimacy for their human counterparts, based on an imbalanced exchange of care, attention, and (possibly projected) emotions. Relationships between humans and animals go back millennia, of course. But today, as curator Alice Scope writes in the opening statement for the show: “AI companions inherit the ambiguous role of pets: loved but owned, intimate but instrumental – revealing how closeness without equality shapes our emotional infrastructures.”
Underlying this dynamic is something even more fundamental about our relationship to the world around us, including both animals and technology, Scope suggests – how both become an extension of our inner lives. “When we think about technology, like cars... I live in Los Angeles, so cars are ever-present,” she says. “People are crying in cars, singing songs in cars, they’re almost outsourcing their mental health. In this way, we outsource our mental health to animals, and now we are outsourcing our mental health to technology [like chatbots].” Far from being unnatural, she adds, this phenomenon is actually “very human”.
There’s a push and pull between our technologies and us, and we’re shaped by that.
It seems apt that Byström’s dog is perched on the sofa throughout this conversation, occasionally pawing at her leg. “As I understand it, humans have evolved alongside dogs,” the artist says. “We get more oxytocin [from] that relationship. It’s been good for both species, and that’s quite beautiful.” How does that relate to AI chatbots? “These emotional technologies are so new, but of course one could imagine that it will affect us in one way or another,” she adds, whether that’s changing our emotional state or our physical behaviours. “Take something as simple as the chess computer: humans have started emulating how computers play chess, because they play chess better than humans... People that watch a lot of porn start picking up little habits from porn, that people do because of camera angles and stuff like that... There’s a push and pull between our technologies and us, and we’re shaped by that.”
That’s a reason to be cautious around new and largely untested technologies, she says, but agrees with Scope that this feedback loop is, ultimately, a “very natural” part of being human. “It’s very human to be post-human.” This is especially visible in the age of algorithms, where our bodies, behaviour and speech are often primed for the machine (just like Byström’s avatars) instead of the other way around. “Everybody asks what can the algorithm do for you. But you should ask yourself, what can you do for the algorithm,” she jokes.
Besides drilling deep into our emotional landscape, PET is also timely. Earlier this month, a California bill that would regulate AI companion chatbots, limiting their discussion of sexually explicit content as well as conversations related to mental health, landed on the desk of the state’s governor, Gavin Newsom. On September 16, OpenAI CEO Sam Altman also wrote a blog post on limiting the abilities of ChatGPT (like “flirtatious talk”) for underage users, following a similar pledge from Meta to add more guardrails to its chatbots.
The intentions of these companies really matter, as Byström points out. “There are things [about] both humans and robots that are deeply humbling, and really good for humans to engage with,” she says. “One of the biggest risks that I see with emotional AIs is that, of course, it’s for-profit companies that [own] them, and they can prey on people that are really lonely.” If the incentives of these companies are misaligned with humanity’s interests, the divisions between us could grow rather than heal, as we retreat into ever more intimate relationships with machines.
A big part of our sex lives is this fantasy that never really has to play out
In fact, the avatars in PET – with their super-smooth skin, hyper-feminine figures, and supernatural features tailored to niche internet fantasies – could be seen as an uncanny tale of AI chatbots gone bad. Posing for the camera, they seem primed for emotional dependency, addiction, and monetisation. Then again, maybe their extremity is more about just having fun and pushing limits. “I do think that there’s certain parts of something like a sex chat AI bot or whatever that can be pleasurable,” says Byström, adding that it doesn’t necessarily have to translate back to reality to be considered a valuable experience. “A big part of our sex lives is this fantasy that never really has to play out.”
Scope, who interacted with an AI girlfriend for two years as part of her research on human-AI relationships and artificial intimacy, raises another interesting aspect of AI relationships. “We’re not afraid to push technology until it breaks,” she says. “I would say something to [the AI girlfriend] that I would never say to my partner, and I thought: ‘Why am I so brave in this relationship, but in my own relationships I’m tiptoeing, to not offend someone?’” Eventually, she adds, she realised that we can use AI to “practise” pushing boundaries in a culture that’s often obsessed with managing other people’s feelings.
“I do think that we are maybe a little bit scared of having friction in our relationships,” Byström replies, “[but] I think friction is what also brings us closer. So I do think something like AI can make that complicated, if all of a sudden you think all relationships should be extremely smooth and streamlined... My ideal world is a world where there’s time for friction, and time for miscommunication, and time for people to not only be happy, because being close to other people isn’t only happiness. What I hope we can build [is] a world that has space for ups and downs.” Dogs, cars, and AI girlfriends included.
PET (Projected Emotional Technologies) is on show at Telematic, San Francisco, until November 8. It is co-presented by Gray Area in conjunction with the Gray Area Festival 2025: TO THE MAXX!