Fraudsters Use AI Deepfake of Sudhir Ruparelia to Push Suspicious Investment Scheme

A sophisticated AI-generated deepfake video fraudulently depicting business magnate Dr. Sudhir Ruparelia endorsing a questionable financial platform known as “OredexiaMarket” has sparked concern, after the tycoon firmly distanced himself from the scheme and warned the public against a growing wave of digitally manipulated scams. 

The video, which has been circulating on multiple social media platforms, features a highly convincing synthetic voice closely resembling Sudhir’s, urging viewers to “take advantage of this opportunity” and register for what is presented as a revolutionary financial platform designed to “help families become wealthier.” 

To make the deception appear credible, the clip is packaged with graphics and visual formatting resembling a legitimate news report, creating the false impression that the platform has been independently verified and publicly endorsed. 

Sudhir has since categorically denied any involvement in the alleged platform. 

“This is completely fake. I have not launched any such app, and I am not involved in anything called OredexiaMarket,” Sudhir said.
“People must be extremely cautious. AI is now being misused to deceive the public.” 

He warned that rapidly advancing artificial intelligence tools are increasingly being weaponized by fraudsters to clone voices, fabricate endorsements, and exploit the credibility of well-known public figures for financial gain. 

In the manipulated footage, the cloned voice claims: 

“Due to the financial crisis around the world, we have created a new platform to help families become wealthier,” 

before encouraging users to sign up and participate, phrasing commonly associated with high-risk online investment scams and deceptive financial marketing operations. 

Cybersecurity and digital fraud observers say such deepfake-enabled scams are becoming more common, particularly as criminals exploit the public trust attached to high-profile business leaders, celebrities, and politicians. 

The trend is increasingly visible across Africa and other emerging digital markets, where fraudsters combine synthetic media, emotional persuasion, and economic anxiety to lure unsuspecting victims into fake schemes. 

The latest incident involving Sudhir highlights the rising danger posed by AI-powered impersonation fraud, especially in an era where manipulated audio and video can be made to appear almost indistinguishable from reality. 

Members of the public have been urged to ignore the circulating video, avoid engaging with the advertised platform, and verify any financial opportunity through official and trusted communication channels before committing money or personal information. 

Staff writer at Lira City Post.
