AI deepfakes are crypto’s biggest threat: Bitget, SlowMist, Elliptic warn

A new joint report from Bitget, SlowMist, and Elliptic highlights the rapid rise of AI-powered crypto crime.
Crypto crime is evolving, increasingly targeting human psychology as an attack vector. According to the report, published on Tuesday, June 10, social engineering scams are becoming more common, and many now leverage AI to increase their success rate.
“In 2024, nearly 40% of high-value frauds involved deepfake technology. And behind most scams—whether Trojan job offers or Ponzi-like ‘staking platforms’—is some form of social engineering designed to exploit trust, fear, or greed,” the Bitget report states.
For example, scammers have used AI-generated videos of high-profile figures, including Elon Musk, to create social proof for scam projects. AI videos are also being used to bypass KYC systems and even lure victims into live phishing Zoom calls.
Other types of social engineering scams are surfacing in the job market. Scammers often pose as recruiters seeking developers, directing job seekers to download what appears to be a test project. In reality, the file contains a Trojan capable of taking over the victim’s computer.
How to protect yourself from AI crypto scams
Blockchain security firm SlowMist outlines several steps users can take to avoid falling victim to scams. First, users should be highly skeptical of promotional content on social media. Posts offering jobs, ChatGPT trading bots, or high staking returns should be approached with caution.
Social engineering scams often create a false sense of urgency. Traders should always pause to consider whether an offer seems too good to be true. The same goes for videos of public figures promoting crypto launches—users should verify through official websites or trusted news sources.
“Bottom line? In an age where AI can mimic anyone, security must start with skepticism—and end with collective defense,” the Bitget report concludes.
SlowMist also warns against clicking on links or downloading files shared in group chats or social media comments. Tools like ScamSniffer can help by automatically blocking phishing links. For suspected rug pulls, users can check MistTrack to see whether a wallet address is tied to known scams.
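For readers curious what this kind of check looks like in practice, the sketch below approximates it with a naive local blocklist lookup for link domains and wallet addresses. It is purely illustrative: the domains and addresses are hypothetical placeholders, and this is not how ScamSniffer or MistTrack work internally.

```python
# Illustrative sketch: naive local blocklist check for links and wallet addresses.
# The domains and addresses below are hypothetical placeholders, not real scam data,
# and this is not the actual ScamSniffer or MistTrack implementation.
from urllib.parse import urlparse

PHISHING_DOMAINS = {"example-scam-airdrop.io", "fake-staking-rewards.net"}   # hypothetical
FLAGGED_ADDRESSES = {"0x0000000000000000000000000000000000000bad"}           # hypothetical

def is_suspicious_link(url: str) -> bool:
    """Return True if the link's domain appears on the local blocklist."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    return domain in PHISHING_DOMAINS

def is_flagged_address(address: str) -> bool:
    """Return True if the wallet address appears on the local blocklist."""
    return address.lower() in FLAGGED_ADDRESSES

if __name__ == "__main__":
    print(is_suspicious_link("https://example-scam-airdrop.io/claim"))        # True
    print(is_flagged_address("0x0000000000000000000000000000000000000BAD"))   # True
```

Real tools maintain far larger, continuously updated threat databases, but the underlying idea is the same: check a link or address against known bad actors before interacting with it.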