John W. Hayes
Consumer and Business Expert

The internet has always been a popular feeding ground for scammers. With the advent of Artificial Intelligence (AI), scams are becoming increasingly sophisticated. In this new age, deepfake scams look and feel more real than ever, and more people are falling for them.

These scams don't target only individuals, however. According to a Deloitte report, 25.9% of business executives say their organization has experienced one or more deepfake incidents in the past year.

In this article, we explain what a deepfake is and show how to avoid being scammed.

Key Insights

  • Deepfake scams are growing in sophistication, using AI-generated audio and video to impersonate trusted individuals and manipulate victims emotionally, financially, and reputationally.
  • Common deepfake examples include impersonation fraud, romance scams, financial/investment fraud, account takeovers, and deepfake blackmail.
  • According to CBS News, deepfake fraud cost the US economy upwards of $12 billion in 2023 and could rise to $40 billion by 2027.

Why Are Deepfake Scams So Easy to Fall For?

Deepfake scams are not just about technology; they are about human psychology. These highly sophisticated campaigns use AI to create audio and video that mimic familiar faces and voices, exploiting the natural human tendency to trust what looks and sounds familiar. 

Once scammers gain their marks' trust, they deploy tactics such as manufactured urgency, pressuring victims to bypass usual verification protocols and even critical thinking, leading them to make quick and costly mistakes.

Most Common Types of Deepfake Scams

By mimicking someone the victim trusts, deepfake scammers lower suspicion and can fool victims more easily than traditional email scams. This enables more targeted and cynical attacks. Top recent scams involving deepfake technology include:

Impersonation attacks

Are you 100% sure the person at the end of that video or telephone call is genuine? Not confirming someone's identity proved a costly mistake for the British engineering firm Arup. According to reports in The Guardian, the company lost $25 million when a staff member was persuaded to make several bank transfers after deepfake scammers digitally cloned the company's chief financial officer on a video conference call.

Not all deepfake scams aim to steal money - some target reputations. In one BBC-reported case, a high school principal was falsely accused of making racist and anti-Semitic remarks through an AI voice scam created by the school’s athletic director, who was under investigation by the principal. Though local media identified the clips as deepfakes, much of the community dismissed this, and the recordings went viral. Even after the scammer was charged and jailed, the principal and school faced threats, highlighting how deepfakes can cause serious reputational and safety risks beyond financial harm.

Romance scams

Targeting vulnerable and lonely people, romance scams might be described as one of the most cynical uses of deepfake technology. Romance scammers often play the long game, creating the appearance of a genuine relationship before eventually requesting money (or gift cards) to cover living expenses, unexpected bills, or flight tickets. And while scammers will always look to create "dream dates" with the perfect looks, career, and backstory to target their victims, some really go to town with their catfishing.

In one recent story, straight out of a Hollywood script, a victim was duped into believing she was dating the movie star Brad Pitt. Over an 18-month "relationship", the scammer posing as Pitt "love bombed" the victim with poems, declarations of love, and even a marriage proposal, before conning her out of $850,000 he purportedly needed for cancer treatment, claiming his bank accounts had been frozen by Pitt's ex-wife, Angelina Jolie.

[Image: deepfake of Brad Pitt]

Account takeover fraud

Account takeover fraud is another scam supercharged by AI. Scammers use AI-created audio or live video to bypass identity verification systems and hijack accounts.

During live identity verification checks, scammers will use real-time voice cloning and even live video manipulation to answer questions or simulate requested facial movements (e.g., "turn your head left"). Consumers are advised not to overshare audio or video on public platforms to reduce the risk of their identities being cloned and misused.

Financial scams and investment fraud

Scammers are increasingly using deepfake videos of celebrities and prominent public figures to persuade individuals to part with their cash in fraudulent investment schemes. The Financial Times recently reported a surge in this type of scam, including fake posts claiming to offer advice from its own veteran chief economics commentator, Martin Wolf.

But it's not just financial experts' advice that is being deepfaked. A growing number of celebrities, including Taylor Swift and Elon Musk, are being falsely represented by deepfake scammers.

In one such scam, an Australian teacher was scammed out of $130,000 after seeing a deepfake video of the musician Nick Cave talking about a fake investment opportunity.

New account opening fraud

Scammers are increasingly creating synthetic identities by combining stolen personal data (like Social Security numbers) with fake names, addresses, and AI-generated photos or videos to bypass biometric verification systems and open new bank accounts or apply for credit.

Using these fake identities, scammers will often build credit ratings with small purchases and on-time payments before maxing out loans and disappearing. Because each identity is anchored to a real Social Security number, the scam looks legitimate to credit bureaus, while deepfake media helps it pass selfie- and video-based KYC (Know Your Customer) checks without raising red flags.

Deepfake blackmailing

Deepfake blackmail (often referred to as sextortion) uses AI-generated videos that appear to show victims in compromising situations. Scammers will often superimpose their victims' faces onto explicit or otherwise embarrassing video footage and threaten to release the content publicly unless the victim pays a ransom.

Scammers also use fake news footage purporting to come from legitimate news channels like CNN, suggesting the targeted victim was involved in various crimes, including sexual assaults. Because the footage looks so realistic, victims fear it would be widely believed if shared on social media, making them more likely to pay a ransom to have it destroyed.

Red Flags and Tools to Spot and Avoid a Deepfake Scam 

Typical red flags to spot and avoid deepfake scams include: 

  • Video/Audio that feels "off": Look for strange blinking, stiff facial movements, lip-sync mismatches, or robotic voice patterns.
  • Inconsistent backgrounds or lighting: In an AI video, the face may not match the lighting or shadows in the rest of the frame.
  • Unfamiliar communication methods: Be wary of anyone connecting via a new phone number, email, or app and claiming to be a known contact.
  • Unusual requests for urgent action: Sudden demands for money or sensitive info, especially those involving threats or emotional manipulation.

As with most scams, it’s always better to take your time and verify the information before taking action. PissedConsumer posts regular Scam Alerts and trustworthy reviews to help you stay ahead of recent scams. 

There are also several software tools that can help consumers detect deepfake fraud by analyzing audio, video, and images for signs of manipulation. These include Reality Defender, which offers real-time detection tools (including a browser extension), and Microsoft Video Authenticator, which provides a confidence score indicating whether a piece of media has been artificially manipulated.
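To show how such tools fit into a routine, here is a minimal sketch in Python of a confidence-score-based check. It is a hedged illustration only: `analyze_media` is a hypothetical stand-in for whatever interface a real detector exposes, and the 0.7 threshold is an assumption for demonstration, not a vendor recommendation.

```python
# Illustrative sketch only. `analyze_media` is a hypothetical stand-in
# for a real deepfake-detection tool that returns a confidence score;
# the 0.7 threshold below is likewise an assumption for illustration.

def analyze_media(path: str) -> float:
    """Hypothetical detector stub. Real tools (e.g., Reality Defender,
    Microsoft Video Authenticator) expose their own interfaces; this
    stub returns a fixed example score so the sketch runs end to end."""
    return 0.83  # pretend the detector flagged strong signs of manipulation

def review_media(path: str, threshold: float = 0.7) -> bool:
    """Flag media for extra verification when the detector's confidence
    score crosses the threshold. A low score is not proof of authenticity."""
    score = analyze_media(path)
    if score >= threshold:
        print(f"{path}: score {score:.2f} -- treat as suspect; verify the sender another way.")
        return True
    print(f"{path}: score {score:.2f} -- nothing flagged, but still verify urgent requests.")
    return False

if __name__ == "__main__":
    review_media("video_call_recording.mp4")
```

The design point to take away is that these tools return probabilities, not verdicts: even a low score should not override the verification habits described above.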

Protect Yourself From Deepfake Scams

As AI-enabled deepfake technology becomes more accessible and convincing, fraudsters are finding new and increasingly manipulative ways to scam their victims, harming them emotionally, financially, and reputationally.

Recognize these tactics? Spotted a suspicious video that tricked you or someone else? Report the deepfake scam to raise awareness.

Legal disclaimers:

  1. While every effort has been made to ensure the accuracy of this publication, it is not intended to provide any legal, medical, accounting, investment or any other professional advice as individual cases may vary and should be discussed with a corresponding expert and/or an attorney.
  2. All or some image copyright belongs to the original owner(s). No copyright infringement intended.
