A Wake-Up Call About Deepfake Dangers on Social Media
I’ve been talking about deepnude culture at all of my trainings for the past five years. More recently, I’ve been discussing how apps that “nudify” people—digitally removing clothing from photos using AI—are being marketed on platforms like Instagram and TikTok.
When I saw a new CBS News investigation showing that these apps are not only being used but advertised directly on the social media platforms our kids use daily, I knew I needed to share it with you.
A recent CBS News report revealed that Meta's platforms, Facebook and Instagram, approved and promoted ads for AI-powered tools that generate fake nude images from real, fully clothed photos. These apps are often marketed with phrases like "undress anyone," and some ads were even shown to teen users.
What you need to know:
- These tools use artificial intelligence to create hyper-realistic fake nudes from real images.
- They’re being used for sextortion, bullying, harassment, and social humiliation.
- These ads were allowed to run despite violating Meta's own advertising policies.
Why this matters:
This isn't about one bad app; it's about how easily children and teens can be exposed to harmful content with just a few clicks. It raises major concerns about consent, safety, and body image, and about how quickly deepfake culture is evolving in the hands of young people.
What you can do:
- Start a conversation with your child or students about deepfakes, image manipulation, and online consent.
- Monitor app downloads and talk openly about the realities of sextortion.
- Report inappropriate ads and advocate for greater platform accountability.
- Stay informed through Shape the Sky’s trainings and resources designed for adults guiding kids in this digital landscape.
Click this link for the CBS News investigation if you want to dig deeper.
Let’s continue to protect our kids—not through fear, but through awareness, communication, and action.
Ryan