Reviewed against FBI.gov and CISA.gov guidance · AI-assisted content (see our editorial standards)
Quick answer
A deepfake is a video, audio recording, or image that has been manipulated by AI to make someone appear to say or do something they never did. The name combines "deep learning" (a type of AI) with "fake." Deepfakes are used to spread misinformation, impersonate public figures, and, increasingly, to scam individuals. The best defense is skepticism: verify surprising or upsetting videos through a trusted news source before sharing or acting on them.
In 2024, a finance worker at a multinational company was tricked into transferring $25 million after attending a video call with what appeared to be colleagues, all of whom were deepfakes. The FBI has warned that deepfake scams targeting individuals are rising sharply, with older adults among the most frequently targeted groups.
You do not need to become a technology expert to protect yourself. You just need to know what to look for.
96%
of deepfake videos online are non-consensual, made without the subject's knowledge
Source: Sensity AI 2024
$25M
lost in a single deepfake video call scam in 2024
Source: FBI IC3 2024
8 hrs
time it took to create a convincing deepfake video in 2018; today it takes minutes
What exactly is a deepfake?
The word "deepfake" covers several related technologies:
Video deepfakes put someone's face onto another person's body, or alter a real video to make the person appear to say something different. A politician can be made to give a speech they never gave. A family member can appear to make a distressing video call.
Audio deepfakes clone someone's voice using as little as a few seconds of real audio. The cloned voice can then say anything. This is the technology behind the grandparent scam calls we cover in our voice cloning guide.
Image deepfakes generate or alter still photos. A person can be placed at a location they were never at, or a real photo can be doctored to add or remove details.
Document deepfakes use AI to generate fake official-looking documents (letters, invoices, contracts, medical records) that look authentic but are entirely fabricated.
💡 Tip
Not every manipulated image or video is a deepfake. Simple photo edits, spliced audio, or staged videos have existed for decades. "Deepfake" specifically refers to AI-generated manipulation, which is harder to detect and faster to produce than traditional methods.
How deepfakes are used to scam people
The FBI identifies four main ways deepfakes are used against individuals:
The grandparent scam, upgraded. A scammer clones a grandchild's voice from social media and calls grandparents claiming to be in trouble. We cover this in detail in our voice cloning scams guide.
Fake video evidence. A scammer sends a doctored video "proving" something that isn't true (a family member doing something embarrassing, a fake legal notice, or a fabricated news clip) to create panic and pressure someone into paying.
Romance scam video calls. Someone who has been communicating with a "romantic interest" online is offered a video call. The person on the call is a deepfake, maintaining the illusion of a real relationship to continue extracting money.
Impersonating authority figures. A video call shows what appears to be a government official, bank representative, or even a family doctor, created to add credibility to a scam demand.
⚠ Important
If someone sends you a video that seems designed to make you feel panicked, frightened, or ashamed, stop. Do not forward it. Do not act on it. Verify it through a separate channel before doing anything. The goal of these videos is to short-circuit your ability to think clearly.
How to spot a deepfake: 6 warning signs
Deepfakes have improved dramatically but still leave tell-tale signs. Look for:
Unnatural blinking: early deepfakes rarely blinked; newer ones over-blink or blink at odd moments. Watch the eyes closely.
Lip sync that's slightly off: the mouth movements don't quite match the words, especially on fast or complex sounds like "f," "v," and "th."
Blurry or soft edges around the face: the boundary between the face and hair, or between the face and background, often looks slightly smudged or unnaturally smooth.
Lighting that doesn't match: the face may appear lit differently from the surrounding environment, or shadows may fall in the wrong direction.
Unnatural skin texture: deepfake skin often looks too smooth (no pores, no fine lines) or has a slightly waxy, plastic quality.
Background inconsistencies: objects in the background may blur, warp, or disappear as the person moves.
🚫 Rule
No single sign proves a deepfake. Real videos can have technical glitches. But if multiple warning signs appear together, especially in a video pressuring you to act fast, treat it as a deepfake until proven otherwise.
What to do when you suspect a deepfake
Step 1: Do not share it. Sharing deepfakes, even to warn others, spreads them further and may harm the person being impersonated.
Step 2: Search for the original. Paste a description of the video into a search engine. If it is a manipulated version of something real, the original often turns up quickly. Reputable news organizations will have covered major events.
Step 3: Reverse image search. For still images, use Google Images or TinEye. Upload the image or paste the URL to find where else it has appeared.
Step 4: Try a deepfake detector. Tools such as Deepware Scanner, along with detectors built by Microsoft (Video Authenticator) and Intel (FakeCatcher), analyze videos for signs of AI manipulation. Availability and accuracy vary, so search "deepfake detector" to find current options.
Step 5: Report it. If the deepfake is being used to scam you, report to the FBI's Internet Crime Complaint Center at ic3.gov. If it involves someone being impersonated, that person deserves to know.
A real example
Barbara, 74, from Ohio, received a video on Facebook that appeared to show her favorite news anchor endorsing a too-good-to-be-true investment opportunity. The anchor's mouth moved naturally, the voice sounded right, and the logo of a real TV station appeared in the corner.
She almost clicked the link. Then she noticed the anchor's blinking never looked quite right, and there was a slight blurring around the hairline. She searched the anchor's name on the news station's website; no such segment existed. She reported the post to Facebook and warned her friends' group.
According to the FTC, this type of celebrity impersonation scam, often powered by deepfake video, cost Americans over $1 billion in 2023 alone.
Frequently asked questions
Can I make a deepfake of myself?
Yes. Several legitimate tools let you do this for entertainment: face-swapping apps, voice cloning for accessibility purposes, and so on. The technology itself is not illegal. Creating deepfakes of other people without their consent, especially to harass or defraud, is illegal in a growing number of states.
Are deepfakes getting easier to make?
Yes, significantly. What required expensive computers and hours of processing in 2018 now takes minutes on a standard laptop with free or inexpensive software. This is why the FBI and FTC have both escalated their warnings in 2025 and 2026.
How do I protect my own image from being used in a deepfake?
You cannot prevent it entirely, but you can reduce risk. Set social media accounts to private. Limit the amount of video and audio you post publicly. Be aware that anything you post publicly can potentially be used as source material.
What should I do if a deepfake of me is shared online?
Contact the platform directly to report impersonation; most major platforms have specific policies against non-consensual deepfakes and will remove them. Document the content (screenshot, save URL) before reporting. Contact local law enforcement if the deepfake is being used to extort or harass you. The FBI's IC3 also accepts these reports.
Is the deepfake problem going to get worse?
According to CISA (the Cybersecurity and Infrastructure Security Agency), deepfake technology is advancing faster than detection technology. The most reliable protection is not detection software; it is a personal habit of skepticism and verification before acting on any surprising or emotionally charged video or audio.
Related reading: