
How to Combat AI-Generated Nude Photos with StopNCII

Available worldwide, StopNCII lets you submit the original and morphed versions of a photo; participating platforms then use the resulting fingerprints to detect and remove the altered image.


Image-based sexual abuse through non-consensual deepfake pornography is not a new problem. But with hyper-realistic generative image models like Stable Diffusion, Midjourney, and the like, creating such content is easier than ever.

Strict laws protecting victims of this abuse are still largely absent. In the meantime, if someone uses AI or Photoshop to manipulate your image into a nude photo, you can seek a quick and easy remedy by visiting StopNCII, which is available worldwide.

You can upload both the original and altered versions of the photo. Once submitted, participating platforms work to remove the altered image, and you never need to contact them directly. Your privacy is fully safeguarded throughout.

How Are They Doing This?

Back in 2021, Meta, along with 50 other global NGOs, helped the UK Revenge Porn Helpline launch StopNCII.org. The service combats the non-consensual sharing of private images online, letting users worldwide proactively protect intimate images on tech platforms through on-device hashing that preserves safety and privacy.

The tool employs advanced hash-generating technology, assigning a unique numerical code to images to create a secure digital fingerprint. Tech companies involved with StopNCII.org use these hashes to detect sharing of these images on their platforms.

Participating companies use the hashes from StopNCII.org to identify shared images on their platforms, ensuring the original images remain on the user’s device. Only hashes, not the images, are shared with StopNCII.org and tech platforms, preventing further distribution of sensitive content and maintaining ownership.
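The on-device hashing idea can be sketched with a toy "difference hash": compare neighbouring pixel brightnesses and pack the results into a short fingerprint. This is a deliberate simplification for illustration only; production systems use far more robust perceptual hashes designed to survive resizing and compression, and the `dhash` function and sample `image` below are hypothetical names, not StopNCII's actual code.

```python
# Illustrative difference hash over a tiny grayscale image,
# represented as a 2D list of pixel intensities (0-255).
# The key property: the image itself never leaves the device;
# only the short fingerprint is shared.

def dhash(pixels):
    """Fingerprint an image by comparing adjacent pixel brightness."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    # Pack the bit string into a compact hex fingerprint.
    return hex(int("".join(map(str, bits)), 2))

# A toy 4x5 "image": 4 rows of 5 intensity values.
image = [
    [10, 20, 30, 25, 5],
    [200, 180, 190, 210, 220],
    [0, 0, 50, 50, 100],
    [90, 80, 70, 60, 50],
]

print(dhash(image))  # -> 0xc750
```

Because the same image always produces the same fingerprint, a platform can match re-uploads of a hashed image without ever seeing the picture itself, which is the principle StopNCII relies on.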

StopNCII.org aids adults over 18 concerned about the non-consensual sharing of intimate images. For those under 18, alternative resources like the National Center for Missing & Exploited Children (NCMEC) provide appropriate support.

Vicious Loop of Deepfakes

According to a report, 96% of nonconsensual deepfake videos online involve women, primarily celebrities, transformed into sexual content without their permission.

“The rise of AI-generated porn and deepfake porn normalises the use of a woman’s image or likeness without her consent,” Sophie Maddocks, a researcher at the University of Pennsylvania tracking image-based sexual abuse, told AFP.

Emma Watson, Kristen Bell, Natalie Portman, Taylor Swift and other public figures have all been targeted. However, the abuse is not restricted to celebrities.

Indian journalist Rana Ayyub revealed in a harrowing post how she became a victim of deepfake porn after taking a stand on the 2018 Kathua gang rape. American Twitch streamer QTCinderella became another victim when people harassed her by sending her copies of deepfakes depicting her.

