Dame Esther Rantzen’s family have released photos of the star celebrating her 85th birthday — after vile trolls spread “repulsive” fake images made with artificial intelligence. The Childline founder has not been seen in public for two years as she privately battles terminal lung cancer. But her daughter, broadcaster Rebecca Wilcox, said the star had consented to real images being released to counter the “abhorrent” misinformation being spread online.
A number of false photographs posted on social media showed an emaciated woman resembling Dame Esther lying in a hospital bed, seemingly close to death. Rebecca, 45, said: “She doesn’t want people to think her cancer is this awful, or to create fear for other people who may be in the same situation.
“She’s eating pavlova and celebrating her birthday with her family. She’s not got the energy she used to, but she’s certainly not on her death bed with a drip or in a coma.
“How dare you suggest that? Think about her grandchildren coming across that and not knowing.”
One fake image showed an anguished woman in a hospital bed, receiving oxygen through a nasal cannula with a drip beside her. Another, which more closely resembled Dame Esther, claimed to show the star lying in a hospital bed with a nurse watching over her.
Appearing on BBC One’s Morning Live on Thursday, Rebecca contrasted the images with real photos of her mother beaming in a yellow dress and blazer as she celebrated her 85th birthday on Sunday.
One photo showed Rebecca holding a pavlova beside her mum, while another captured the veteran broadcaster wearing a straw hat as she took a stroll around her garden in May.
Dame Esther has spoken to the media occasionally through radio or audio appearances, particularly while supporting efforts to legalise assisted dying. But her family have closely guarded her privacy, releasing very few images and declining to share details of her treatment.
Seeing the “repugnant, repulsive, abhorrent” fake images online felt like an “exposure and an invasion”, Rebecca said.
She added: “It’s a horrible reason to break this really careful privacy, we have been so guarded of her information and her image. It’s not that she looks any different, she just wants to feel private and safe — that’s why she’s not been in the public eye visually.
“Anyone with a terminal diagnosis or who knows someone with one would know that you just need a little bit of privacy. You want to feel comfortable and part of being comfortable is not being exposed.”
Rebecca, who regularly appears on the programme as a consumer champion, told Morning Live hosts Gethin Jones and Kimberley Walsh that her mum “looks incredible” despite her health battle.
She said she had warned her brother, Joshua Wilcox, to “tell his kids that these pictures are out there and they’re not real, that’s not what she looks like”.
Some AI image generation platforms allow users to upload photos of real people so the technology can learn their likeness and create fake ones.
Dame Esther’s experience is part of a much wider problem of fake images and misinformation circulating online, Rebecca added.
She urged the people behind the images to stop and “think about the fact that the people at the centre of these stories are alive and have families — and are real”.
“There are all sorts of things going on with this diagnosis, I don’t want mum to have to think about this.
“I don’t want her friends, who haven’t seen her in the flesh for months or even years because she’s so secluded now, to think that’s what she looks like because that’s horrible for them.”
Morning Live’s scam and crime expert, Rav Wilding, highlighted several details in the images that gave them away as fakes.
These included numbers that appeared to have been artificially added to the IV drip bag in one image, and a button in the wrong place on a machine. Rav said: “That’s because AI struggles with the finer details within an image it’s creating.”
Rav also highlighted how the bed sheet in the same image appeared to blend into the wall. He added: “It’s worth looking in the background — not just the focus on where your attention would ordinarily be drawn, the subject — but look around there to see these subtle clues where AI is still struggling to create those.”
Describing the proliferation of online misinformation, Rav said a recent report suggested 52% of people primarily get their news from social media.
He added: “So most people are going to believe what they see as being correct and there is so much misinformation out there. Some of the fake images that we’ve found are not all illegal and not all going out to try and scam someone.
“There are lots of things out there that are not to be trusted online. It’s really just a reminder — it doesn’t matter what you see pop up, you need to do some checks to make sure it is something that you can trust.”
Other examples of deceptive images made with AI include posters for nonexistent sequels to popular films, and fake photos used to sell cheap products.
Rav urged people to act if they suspect an image is fake. He said: “We need to report these things to get them taken down. Every social media site will have its own reporting platform.”
Rebecca said she had reported the image of her mother “the moment I saw it”. She added: “I’ve not heard anything and they haven’t been taken down.
“I just feel for everybody else that’s on the website too. There are so many other people this has happened to. We just need to report every single one.”
It is not only celebrities who have been targeted. Earlier this year, an expert warned that artificial intelligence was being used to create “AI revenge porn” at an average cost of 27p per image.
Dr Elissa Redmiles, assistant professor at Georgetown University in the US, said this type of image-based abuse was rising, with commercial ecosystems facilitating the process — from advertising to payment platforms.
• Morning Live is made by BBC Studios Entertainment. Watch weekdays from 9.30am on BBC One and BBC iPlayer.