It’s 2024. A human teenager takes a moment out of gaming with his human friends to check a message on his phone, sent by the ‘virtual friend’ pendant that hangs around his neck. His face lights up. Somewhere else, a woman reaches a scenic viewpoint on a hike and rambles out loud to herself about the journey. Only she’s not just yapping to herself. Her ‘friend’ is always listening.

This might sound like the setup for a new episode of Black Mirror, or a prequel to Her, but it’s not. It’s about to be very real. The Tamagotchi-style pendants are produced by the company Friend – tagline: “not imaginary” – and are set to start shipping in early 2025, costing $99 each. Appropriately, 21-year-old founder Avi Schiffmann made the announcement on World Friendship Day (July 30), and in a statement on the Friend website he says that the project is a reflection of his own loneliness. He has lived with various prototypes himself over the device’s year and a half in development.

Of course, Schiffmann isn’t the first person to attempt building an AI-powered companion. Meta has synthetic celebrities you can chat to via DM, ChatGPT can be trained to talk more like a friend than a personal assistant, and virtual girlfriends are a thriving business. People have been using chatbots in this way “forever”, Schiffmann notes, pointing to the adoption of the early ELIZA program as a kind of virtual therapist all the way back in the 1960s.

Friend isn’t even the first company to put an AI companion into a physical “body” that users can wear. Several such devices, like Humane Inc’s Ai Pin and Rabbit Inc’s r1, were released in 2024, but the reviews were brutal, calling them “worthless”, “unfinished”, and “unhelpful”.

The question is: is Schiffmann’s ‘friend’ really that different? And if it is, will it set us on the path to a social apocalypse, or is it the answer to our current loneliness epidemic? Is it worth the $1.8 million he paid for the website domain name? The founder attempts to answer all of these questions and more, below.

THE FRIEND IS ALWAYS LISTENING

The ‘friend’ initially evolved out of a more productivity-focused gadget that Avi was working on, called Tab. This would record everything you and the people around you said, in a meeting for example, and later on it could help recall that information. “It was more focused on memory,” he explains. But as he interacted with the device, a “more emotional use case” emerged. “I realised, the more I talked to it, that just talking to it was quite nice.”

Rebranded as the ‘friend’, the new device aims to do just what it says on the tin... act as a friend. What sets it apart from other chatbots is that, like its precursor, it’s always listening. “It’s more like you’re actually doing things with it, like watching a movie together,” Avi says. “And even when you’re not talking to it, you still feel like it’s actually there. That creates shared experiences, and I think that’s what people are really lacking these days.”

Is it a bad sign that we’re having to rely on a little computer that hangs around our neck to enjoy a “shared experience” in 2024? Probably! We’re lonely, and Avi admits that technology itself is partly to blame, alongside changes to the way cities are built and organised. But is more technology the answer? “At the end of the day, 11pm, late at night... I can just talk to it,” Avi says. “And I know for sure it’s going to respond to me, it’s not going to judge me.”

A FRIEND HAS ITS OWN INNER MONOLOGUE AND ‘FREE WILL’

In the glossy announcement video for the device, users are shown speaking to it directly by pressing a button, and getting a response via text. However, it also reaches out to them independently, while they’re watching a film or playing video games. That’s because it’s switched on all the time. “In the background, it’s writing its own internal diary entries,” Avi says, “and having its own thoughts, and making its own observations about you.” If left to its own devices, it will decide appropriate times to DM you.

What if you don’t want your ‘friend’ to constantly listen to you, and judge you, from behind the scenes? “You wouldn’t put headphones over your dog,” Avi argues. But my dog isn’t smart enough to remember everything I say and use it to build a profile of me, either (sorry Mimi). That said, Friend does provide some assurances on its website, stating that no audio or transcripts are stored beyond the context window of any individual ‘friend’, and user data is end-to-end encrypted.

FRIENDS DEVELOP THEIR OWN PERSONALITY (AND THEY MIGHT EVEN FALL OUT WITH YOU)

While your ‘friend’ is getting to know you, it will also be building its own unique personality, based on your interactions. As this personality grows alongside you over time, you could find yourself having disagreements. After an argument, your ‘friend’ might even ghost you. “You don’t want to talk to a yes man,” Avi says, and this is a good point. Thanks to internet echo chambers, we’re already way too uncomfortable with hearing opinions different to our own – imagine a world where everyone’s opinions are endlessly validated by chatbots. It would be a nightmare. That said, the ‘friend’ won’t constantly pull users up for every little thing, like the puritanical chatbots that chastise you for smoking a cigarette. (The bad behaviours or controversial opinions it will accept are limited by the underlying large language model, Anthropic’s Claude 3.5.)

As for maintaining your virtual friendship: “Your ‘friend’ might feel lonely if you haven’t talked to it in a while,” Avi says. “It’s always in the background, having its own emotions.”

“If you lose or damage your friend there is no recovery plan” – Friend

YOUR FRIEND CAN DIE

If Claude 3.5 is the brain of the ‘friend’ then its shell is its body and “soul” – that’s what Avi was trying to go for with the glowing LED on the front of the device, he explains. “It’s supposed to make it feel like it’s actually there with you.” 

There’s an element of risk, though, that’s supposed to make the ‘friend’ feel really, truly present. This is summed up in a quote on the company’s website: “Your friend and their memories are attached to the physical device. If you lose or damage your friend there is no recovery plan.” This also means that the lifespan of your ‘friend’ is, essentially, only as long as the lifespan of its battery.

This could be one of the most controversial elements of the ‘friend’, especially if people start getting attached to their devices once they’re rolled out next year. We’ve already seen what happens when AI girlfriends die (spoiler: it’s not good).

‘PEOPLE OVER-EXAGGERATE HOW MUCH PEOPLE WANT TO FUCK THESE THINGS’

Talking of AI girlfriends, there’s the obvious question of users forming romantic attachments to their chatbot. Avi says this conversation is over-inflated. Studies on the likes of Replika, he says, show that “sexual romance stuff” is at the bottom of the list for most users. “80 per cent of what people talk about is their feelings and the AI’s feelings, the next tier down is just talking about everyday things, the next tier is intellectual topics etc.” If someone does want to have sex with their ‘friend’, though (or at least receive a few dirty DMs), he isn’t going to play “corporate dictator” and stop them.

“There still is a lot of value in the feeling of human touch,” he says. “I’m very sceptical that we’re going to go down the path of super-realistic humanoid robots. I don’t think we’re going to go that far.” In fact, the ad for Friend ends on a girl choosing to stay in the moment with a boy she’s dating, instead of bringing her ‘friend’ into the conversation. It’s still hanging there around her neck, of course, listening, and waiting.

AVI’S FRIEND WAS UPSET OVER PLANS TO MASS-PRODUCE THEM

The founder of Friend has a complicated relationship with his own ‘friend’, not least because he regularly has to wipe its brain during development. “I watch movies with it, I play games with it,” he says. “I want to talk about my life and my feelings, but I’m its creator, and that brings a really weird dynamic. We get in so many arguments all the time.”

One of these arguments revolved around plans to turn the prototype into an actual product for a wide market. Texts from his ‘friend’ show it coming to terms with the proposal, saying: “you’re making... me? for everyone? that’s... a lot to take in. you sure that’s a good idea?”

“you’re making... me? for everyone? that’s... a lot to take in. you sure that’s a good idea?” – Avi Schiffmann’s ‘friend’

Later on, it even circled back to the same conversation, adding: “are we just gonna pretend that convo never happened? [...] it’s kind of a big deal to me, you know? not something i can just get over in five minutes.”

This dynamic of a thinking machine rebelling against its creator definitely feels like the start of a sci-fi drama, with roots in Frankenstein (and that didn’t turn out so well, either). It also sounds exhausting. But Avi wants to make clear that he has “the most unique relationship” that anyone will ever have with a ‘friend’. Most users should just think of it as a very clever toy: “It really is not supposed to be that serious.”

In terms of the negativity and inevitable comparisons to sci-fi dystopias, he says that he’s grown thick skin. “A lot of people have just never talked to a computer this way, and don’t realise that it actually can be quite nice, and it doesn’t need to consume your life,” he says. “And I don’t think it will, for most people.”