As I was scrolling through TikTok the other evening, I was stopped in my tracks when I came across a video of the late actor Robin Williams. In the face-to-camera clip, “he” was discussing loneliness, low mood and depression, telling viewers that things will get better.
The likeness to the Mrs Doubtfire star not only struck me as uncanny, it disturbed me to my core – because, although it looked and sounded like him, it wasn’t. Someone had gone out of their way to create an AI-generated version of him, some 11 years after his death.
What was even more crass was the nature of the video, given that Williams took his own life in 2014, following health issues after being misdiagnosed with Parkinson’s. I swiftly reported it and came out of the app, no longer feeling the desire to watch short-form videos before bed.
But it’s not just people on the internet and fans who see this type of content. Taking to social media on Monday, Williams’s daughter Zelda implored people to stop defiling her dad’s memory in this “disgusting” way, calling it “slop” and stating that it’s not what her father would have wanted.
“To watch the legacies of real people be condensed down to ‘this vaguely looks and sounds like them, so that’s enough’, just so other people can churn out horrible TikTok slop puppeteering them is maddening,” she wrote on her Instagram Story.
“You’re not making art, you’re making disgusting, over-processed hotdogs out of the lives of human beings, out of the history of art and music, and then shoving them down someone else’s throat hoping they’ll give you a little thumbs up and like it. Gross.”
She is, of course, absolutely right. And having lost my own father around a similar time, I cannot even begin to imagine how it must feel to have complete strangers pick apart your private life – to have them imagine weird little scenarios and scenes and manipulate your loved one in such a way. It’s morally bankrupt.
Any argument put forward about it being an “homage” to the actor and his life’s work is completely hollow when neither is at the heart of these videos, and even less thought is given to his family. It’s one thing to share old clips from Dead Poets Society, Hook or Flubber, but quite another to get a computer to pretend it’s him. What’s perhaps even more jarring is that some “creators” will be profiting from views, likes, shares and so on.
The whole thing feels like an episode of Black Mirror. There’s actually one that comes to mind, where Domhnall Gleeson’s character dies and his grieving partner allows a tech firm to trawl through her conversations, emails and calls to create an android version of him. Needless to say, it doesn’t turn out well. That, at least, was with her permission.
But it’s not just the stuff of fiction – or nightmares – now. Text-to-video software Sora, from OpenAI, has had over a million downloads in less than five days and is used to create such clips. Likewise, Silicon Valley moguls have been refining so-called “grief tech” – which not only conjures up avatars of the deceased, it also mimics their voices, mannerisms and conversation techniques.
For some, these “breakthroughs” have offered solace. And having suffered in the throes of unimaginable grief myself, I am all for people finding ways to cope and navigate such monumental loss. But the idea of AI replacing real human interactions – or having access to such private, personal information – absolutely terrifies me.
There is no doubt there’s money to be made here – after all, before AI and social media, there were mediums faking contact with those who had passed and profiting from others’ vulnerabilities. But for me, this is a step too far.
However destructive and brutal it may be, death is a part of life, and human connection is what spurs us on. If we turn more and more towards AI, we run the risk of losing that altogether.
If you are experiencing feelings of distress, or are struggling to cope, you can speak to the Samaritans, in confidence on 116 123 (UK and ROI), email jo@samaritans.org, or visit the Samaritans website to find details of your nearest branch.
If you are based in the USA, and you or someone you know needs mental health assistance right now, call the National Suicide Prevention Lifeline on 1-800-273-TALK (8255). This is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.
If you are in another country, you can go to www.befrienders.org to find a helpline near you.