Someone Made a Fake Porn Video of Me — What Can I Do?


Imagine waking up to messages saying there’s a pornographic video of you circulating online — but you’ve never made such a video. Welcome to the horrific reality of deepfake pornography, where AI technology is used to paste your face onto someone else’s body in explicit content.

If you’re here searching “Someone made a fake porn video of me,” you’re not alone. And more importantly — you have options.

What Is Deepfake Pornography?

Deepfake porn uses artificial intelligence (AI) to manipulate video content, typically by superimposing a victim’s face onto the body of another person in a pornographic video. The result is a disturbingly realistic video that can destroy reputations, relationships, and careers — even though it’s fake.


A Real Case: An Indonesian Public Figure

In 2023, an Indonesian beauty influencer with over 1 million TikTok followers was targeted with a deepfake porn video shared anonymously on Telegram. Even though the video was AI-generated, many viewers believed it was real. She lost brand sponsorships, received threats, and faced public humiliation. The case was eventually taken to Bareskrim (the national police’s criminal investigation agency), but the psychological damage remained.

Is This Illegal?

In Indonesia:

Yes — deepfake pornography falls under several laws:

  • ITE Law (UU No. 11/2008, as amended): Criminalizes the creation or distribution of electronic content that violates decency (Pasal 27 ayat 1).

  • UU TPKS (UU No. 12/2022 on Sexual Violence Crimes): Defines digital sexual violence and criminalizes the distribution of sexually exploitative material.

  • Pornography Law (UU No. 44/2008): Prohibits creating and sharing pornographic material, especially when it is non-consensual.

Depending on which law applies, penalties can range from roughly 6 to 12 years in prison and/or substantial fines.

In the United States:

  • Federal Protections (as of 2025): Under the newly passed “TAKE IT DOWN Act”, it is a federal crime to knowingly publish non-consensual intimate imagery, including AI-generated deepfakes, and platforms are required to remove reported content promptly.

  • State Laws: California, Virginia, Texas, and other states have criminalized deepfake pornography. Victims may be able to sue under civil law or report the conduct as a crime.

What to Do If Someone Made a Deepfake Porn Video of You

1. Don’t Panic — Document Everything

  • Save the URL, screenshots, usernames of uploaders, and platform details.

  • This evidence is critical for takedowns and legal reports.

2. Report the Video to the Platform

  • Facebook, Instagram, TikTok, Reddit, Telegram, and even Google have processes to report “non-consensual intimate content.”

  • Use terms like “deepfake pornography” and “AI-generated without my consent” in your report, and include your full name and photos if the platform requires identity verification.

3. File a Legal Report

  • In Indonesia, file a complaint with the cybercrime unit of Bareskrim POLRI.

  • In the U.S., go to local law enforcement, or submit a complaint through the FBI’s IC3 (Internet Crime Complaint Center) portal if the case involves online harassment or blackmail.

4. Seek Legal and Psychological Support

This is not just a digital issue — it is an emotional, reputational, and potentially criminal one. Reach out to a lawyer and a mental health professional; you should not have to handle this alone.

What Not to Do

  • Don’t pay anyone claiming to delete the video for a fee — many are scams.

  • Don’t engage with the uploader directly — it can escalate into blackmail or extortion.

  • Don’t stay silent — the longer it spreads, the harder it is to contain.

How to Remove the Video Online

While removal depends on the platform, here are general steps:

  • Use the non-consensual intimate imagery removal forms offered by Google (for removing results from Search) and Meta (for Facebook and Instagram).

  • For platforms like Telegram or niche porn forums, send formal legal takedown notices.

  • Contact Bullyid.org to help you file requests, draft legal takedown letters, and trace where the content has spread.

How We Can Help

Bullyid is Indonesia’s leading nonprofit for victims of digital abuse, including deepfakes. We provide:

  • Free legal consultation

  • Mental health counseling with trauma experts

  • Help removing deepfake content from social media and search engines

  • Guidance on filing reports to police and cyber authorities

We’ve helped thousands of victims reclaim their digital identity and fight back with strength.