It is a truth universally acknowledged that tech companies will do literally anything for money. But now, Amazon has sunk to new depths by choosing to capitalise on people’s grief.
Rohit Prasad, Amazon senior vice president and head scientist for virtual assistant Alexa, revealed on Wednesday that Alexa will soon be able to mimic the voices of the dead.
Prasad, speaking at the company’s re:MARS conference in Las Vegas this week, described a potential new feature which can synthesise longer stretches of speech from short audio clips. In a video segment shown at the event, a child asks “Alexa, can grandma finish reading me the Wizard of Oz?”, a request Alexa obliges by switching into ‘grandma’s’ voice and reading on in an eerily realistic manner.
Prasad said the technology sought to “make the memories last” after “so many of us have lost someone we love” during the pandemic.
“This required inventions where we had to learn to produce a high-quality voice with less than a minute of recording versus hours of recording in the studio,” Prasad said. “The way we made it happen is by framing the problem as a voice conversion task and not a speech generation path. We are unquestionably living in the golden era of AI, where our dreams and science fiction are becoming a reality.”
Obviously, this throws up a lot of rather urgent ethical questions. Is it morally OK to capitalise on someone’s grief and sell them something which promises to alleviate their pain when they’re at their most desperate? Plus, we already have huge issues with deepfakes – what’s to stop someone using the technology to fabricate false quotes from someone who’s still living? While it might be pretty funny to get your Alexa to make it sound like Paul Mescal is saying “Now playing on Amazon Music, Despacito”, there may be people out there who would use this technology for more nefarious purposes.
Also, how on earth do you explain to a grieving child that no, grandma really is dead, it’s just advanced artificial intelligence technology designed to sound like grandma? Surely it would be kinder and healthier to let a child – or anyone – grieve properly, rather than cling on to an AI rendering of their late loved ones? And, anyway, what if grandma doesn’t want to become a disembodied voice? What about the rights of the dead?
Many on social media have likened the new feature to Black Mirror – specifically the ‘Be Right Back’ episode, where Hayley Atwell’s character becomes obsessed with an AI rendering of her late boyfriend. “Black Mirror is supposed to be a warning, not a pitch,” one user wrote.