Meta launched a new tool Thursday aimed at stopping revenge porn from spreading on Facebook and Instagram, but it requires users to submit their sexually explicit photos and videos to a website.
Anyone concerned that their private images or videos could be posted on social media can file a case through StopNCII.org, which stands for Stop Non-Consensual Intimate Images.
Every photo and video that is submitted receives a unique digital fingerprint, or hash value, which is then used to identify any copy that is shared or that someone attempts to post.
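As a rough illustration of how this kind of fingerprint matching can work, the sketch below uses the open-source Python library imagehash to compute a perceptual hash of an image and compare it against a newly uploaded file. This is not Meta's or StopNCII.org's actual implementation, which is not detailed here; the file names and the match threshold are hypothetical placeholders.

```python
# Minimal sketch of hash-based image matching, in the spirit of the
# "digital fingerprint" described above. NOT Meta's or StopNCII.org's
# real system; paths and the threshold are illustrative only.
from PIL import Image          # pip install Pillow
import imagehash               # pip install ImageHash

# Fingerprint the original image the owner wants to protect.
original_hash = imagehash.phash(Image.open("my_private_photo.jpg"))

# Fingerprint a newly uploaded image that needs to be checked.
candidate_hash = imagehash.phash(Image.open("uploaded_copy.jpg"))

# Perceptual hashes of near-identical images differ by only a few bits,
# so a small Hamming distance is treated as a likely match.
MATCH_THRESHOLD = 5
distance = original_hash - candidate_hash  # Hamming distance between hashes

if distance <= MATCH_THRESHOLD:
    print(f"Likely match (distance={distance}): flag or block the upload.")
else:
    print(f"No match (distance={distance}): allow the upload.")
```

The relevant point, consistent with Meta's description, is that only the hash values need to be stored and compared by the matching service, not the images themselves.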
Although the site was built with 50 partners around the world, sharing intimate images and videos with any third-party website carries obvious risks. Meta says it will never have access to, nor store copies of, the original images.
Antigone Davis, global head of safety at Meta, wrote in a blog post: “Only hashes, and not the images themselves, are shared with StopNCII.org or participating tech platforms.
“This feature prevents the further spread of NCII content and keeps the images in the possession of the owner.”
DailyMail.com contacted Meta regarding the safety of this tool, but has not received a reply.
StopNCII.org builds on a 2017 pilot program in Australia that asked members of the public to submit their photographs so that hashes could be generated and used to detect matching images on Instagram and Facebook.
That pilot serves as the foundation for the new tool to stop revenge porn.
StopNCII.org allows anyone to open a case if they are concerned that their intimate photos or videos have been posted, or may be posted, to either of these social media platforms.
Meta initially planned to have people upload the photos and videos directly to Facebook, but the sensitive media would first have had to be reviewed by human moderators before it could be stopped from spreading.
Recognizing this, the social media company decided to bring in StopNCII, a third party that specializes in online safety, image-based abuse, and women’s rights.
StopNCII.org was created for people over 18 who believe that an intimate photograph of themselves may be shared or has been shared without their permission.
According to a 2019 NBC report, Meta identifies almost 500,000 cases of revenge porn every month.
Facebook has a team of 25 employees dedicated to dealing with revenge porn; along with an algorithm that identifies nude images, the team helps vet reports.
With StopNCII.org, however, those human moderators are replaced by hashes that can detect and identify the images once potential victims have shared the explicit content with Meta.