Deepfake industry on TikTok: how fake videos of famous doctors sell dubious “miracle cures”


Hundreds of renowned doctors are being featured in deepfakes on social media to spread health misinformation and deceive customers into purchasing products, LBC can reveal.

Doctors interviewed by LBC warned the AI tech is ’preying on people’s vulnerabilities’ and could lead to serious health complications if people follow the fake advice.

They said the videos have been viewed by millions of people who have ’no idea’ the clips are fake, warning that the technology is becoming ’better and better’ and harder to spot.

They warned there is currently no way for these wellness brands and scammers to be punished as legislation hasn’t caught up with the technology yet.

In one fake video posted by the TikTok account gethealthytok, the late Dr Michael Mosley can be seen speaking to a conference warning of foods to steer clear of if you want to avoid dying before you’re 40, including coffee.

Instead of coffee, the fake Dr Mosley suggests an alternative - Himalayan shilajit.

 
 
 
View this post on Instagram

A post shared by Dr Idrees Mughal (MBBS, MRes, DipIBLM) (@dr_idz)

He even suggested a specific brand selling the product - Wellness Nest - which has a US-based address.

That same company is promoted repeatedly by the same TikTok account - using deepfakes of dozens of other celebrity doctors, including Zoe’s Professor Tim Spector.

The account had more than 12,000 followers - and even had a link back to the product and the Wellness Nest website in its bio.

Dr Mosley died last summer after disappearing on the Greek island of Symi during a holiday with his wife.


Dr Michael Mosley went missing last June during a walk in a mountainous area of Symi, Greece. Picture: Getty

’Indistinguishable’

Dr Matthew Nagra, a naturopathic doctor who debunks health misinformation online, is one of the doctors whose identity has been faked.

 

He told LBC he was horrified to learn his identity had been stolen in a clip where he appeared to claim he was a cardiologist.

In the clip, an AI version of Dr Nagra advocated for Himalayan shilajit as a solution for avoiding heart attacks.

"I, knowing myself, could tell that it was AI just by the way that it was speaking as it was different to my normal cadence and everything.

"But there were people messaging me or who had commented on the video who legitimately thought it was me," he told LBC.

"That’s what makes it scary. AI is getting to a point where in the next two, three, four, five years, it might be indistinguishable. And that can be really bad."

But Dr Nagra claimed that when he tried to report the video to TikTok, he was told it didn’t violate their terms and conditions.

That is despite TikTok’s website stating that it does not allow "AI-generated content that shows... fake authoritative sources or crisis events, or falsely shows public figures in certain contexts."

Dr Nagra said: "It’s ridiculous. There’s not even much we can do about it from that standpoint. It means it’s on the consumer to check or verify whether something is real."


The gethealthytok account on TikTok has hundreds of deepfakes promoting an unproven wellness product. Picture: TikTok

’Extremely scary’

He said social media users should check whether health and wellness influencers cite evidence to back up their claims, or are simply marketing a product, to help determine whether a clip might be AI-generated.

Dr Idrees Mughal is a medical doctor with nearly 2 million followers on TikTok and 700,000 followers on Instagram, making videos debunking inaccurate health claims made by other wellness influencers online.

He was the one who first alerted Dr Nagra to the deepfake of him that was circulating on social media.

Dr Mughal told LBC it was ’extremely scary’ to discover there were also deepfakes of him on the internet, including one video in which he seemingly promotes a product that misleadingly claims to cure back pain.

"It’s becoming more and more scary now because it’s becoming indistinguishable from actual real clips," he told LBC.

He warned there is no proper way of reprimanding those creating these deepfakes, as legislation in the UK hasn’t caught up with the technology.

"There’s no way for this company at the moment, the one that’s got millions and millions of followers, that is making likely hundreds of thousands, if not millions of dollars pushing shilajit.

"There is no way for them to have any kind of recall. There’s no way for them to be held accountable because the legal ramifications aren’t in place yet. The legislation hasn’t caught up to how quick AI is advancing."

Even if the law did punish those who use deepfakes on social media, Dr Mughal warned this would not do much to stop the problem.

"The issue is that these wellness brands can be set up in literally two hours.

"Imagine you ban this brand that I called out the other week. That same product can just be pushed on a whole separate account, and they can repost all of these same videos literally within the space of 30 to 60 minutes.

"It’s not difficult. And they could just close down the existing company, and someone, under a different person’s name, can just set up the same thing and do it all over again."

But he said there are still some tell-tale signs to look out for when trying to determine whether a video on social media is actually real when it comes to health information.

"When you see a deep fake of an evidence based creator, you then think twice. Okay, is this a genuine clip on this page or is it not?

"If it’s not on the original creator’s page, then just assume it’s not real," he said.

He added that the public should be wary of "absolute statements" where doctors in clips "claim something can fix something entirely that’s just false" by using "sensationalised hyperbolic claims", not citing their evidence and pushing a product.

Following LBC’s investigation, TikTok has removed the gethealthytok account from its platform.

A spokesperson told LBC: "Our Community Guidelines make clear we do not allow account behaviour that may spam or mislead our community, including spam or impersonation accounts, and we do not allow attempts to defraud or scam members of our community.

"Our Community Guidelines also prohibit health misinformation that may cause significant harm to individuals or society, regardless of intent and we remove this content from the platform when we find it.

"That includes inaccurate medical advice that discourages people from getting appropriate medical care for a life-threatening disease, or other misinformation that may cause negative health effects on an individual’s life.

"For videos that we removed for violating our policies on misinformation in Q1 2025, we removed more than 99.1% proactively (before anyone filed a report on it)."

 

James Smith
