Image-based sexual abuse through non-consensual deepfake pornography is not new. But with hyper-realistic generative image models such as Stable Diffusion and Midjourney, producing such content is easier than ever.
Strict laws to protect victims are still largely missing. In the meantime, there is a quick and practical recourse: if someone uses AI or Photoshop to manipulate your image into a nude photo, you can report it through StopNCII, which is available worldwide.
You can submit both the original and the altered version of the photo. Once submitted, the altered image can be detected and removed from participating online platforms without you having to contact those platforms directly, and your privacy is safeguarded throughout.
How Are They Doing This?
Back in 2021, Meta, along with 50 other global NGOs, helped the UK Revenge Porn Helpline launch StopNCII.org. The platform combats the non-consensual sharing of intimate images online, letting users worldwide proactively protect such images across tech platforms through on-device hashing, which preserves safety and privacy.
The tool employs hash-generating technology that assigns a unique numerical code to each image, creating a secure digital fingerprint. The original image never leaves the user’s device: only the hash, not the image itself, is shared with StopNCII.org and participating tech companies, which use it to detect and prevent further sharing of the content on their platforms while the user retains ownership of the image.
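To make the mechanism concrete, here is a minimal sketch of how on-device perceptual hashing and hash matching can work. It is an illustration only: StopNCII.org’s production pipeline is not public in this form, so the file names, threshold, and use of the open-source imagehash library below are assumptions, not the service’s actual code.

```python
# Illustrative sketch only: not StopNCII.org's real pipeline.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Assumed threshold for illustration; real systems tune this carefully.
MATCH_THRESHOLD = 8  # max Hamming distance (out of 64 bits) to call a match


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash ("digital fingerprint") of an image.

    Only this hash ever needs to leave the user's device; the image
    itself stays local.
    """
    return imagehash.phash(Image.open(path))


def is_same_image(hash_a: imagehash.ImageHash, hash_b: imagehash.ImageHash) -> bool:
    """Two perceptual hashes match if their Hamming distance is small,
    so re-encoded or lightly edited copies of the same image still match."""
    return (hash_a - hash_b) <= MATCH_THRESHOLD


if __name__ == "__main__":
    # On the victim's device: hash the image to be protected (hypothetical path).
    protected_hash = fingerprint("my_private_photo.jpg")

    # On a participating platform: hash each uploaded image and compare.
    uploaded_hash = fingerprint("some_uploaded_photo.jpg")
    if is_same_image(protected_hash, uploaded_hash):
        print("Match found: block the upload and flag it for review.")
    else:
        print("No match.")
```

The key design point this sketch illustrates is that only the numerical fingerprint is ever compared or transmitted, never the photo itself, which is why the approach preserves the user’s privacy.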
StopNCII.org supports adults aged 18 and over who are concerned about the non-consensual sharing of intimate images. For those under 18, alternative resources such as the National Center for Missing & Exploited Children (NCMEC) provide appropriate support.
Vicious Loop of Deepfakes
According to one report, 96% of non-consensual deepfake videos online depict women, primarily celebrities, whose likenesses are turned into sexual content without their permission.
“The rise of AI-generated porn and deepfake porn normalises the use of a woman’s image or likeness without her consent,” Sophie Maddocks, a researcher at the University of Pennsylvania tracking image-based sexual abuse, told AFP.
Emma Watson, Kristen Bell, Natalie Portman, Taylor Swift and other celebrities have been targeted. However, the problem is not restricted to celebrities.
Indian journalist Rana Ayyub revealed in a harrowing post how she became a victim of deepfake porn after taking a stand on the 2018 Kathua gang rape case. American Twitch streamer QTCinderella became the latest victim of deepfake porn, harassed by people sending her copies of the deepfakes that depicted her.