
Meta launches tool to prevent revenge porn from spreading on Facebook and Instagram


Meta launches tool to stop revenge porn from spreading on Facebook and Instagram – but users worried about victimization must submit their images and videos to a hashing database to create a case

  • The tool is a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images’
  • People concerned about their intimate photos or videos that are or could be posted on Facebook or Instagram can create a case through the website
  • You do this by uploading the images or videos to the website
  • Meta says the content is converted into a digital fingerprint that lets it identify and detect the explicit content, and claims that no human eyes ever see the images

Meta rolled out a new tool Thursday that prevents revenge porn from spreading on Facebook and Instagram, but it requires people to submit their sexually explicit photos and videos to a hashing database.

If anyone is concerned that their intimate images or videos have been or could be posted on either platform, they can open a case through a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images’.

Each photo or video submitted receives a digital fingerprint, or unique hash value, which is used to detect and block copies that are shared, or that someone attempts to post, without the person’s consent.

However, sharing intimate images and videos of yourself with a third-party website may give many users pause. The website was created with 50 global partners, and Meta says it ‘will not have access to, or store copies of, the original images.’

A Meta spokesperson told DailyMail.com in an email that ‘only the person who files a case with StopNCII.org can access their images/videos’ and that all of the calculations needed to compute an image’s hash happen in the browser, meaning ‘images never leave the person’s device.’

“Only cases are submitted to StopNCII.org and hashes of the person’s images or videos are shared with participating tech companies such as Meta,” the spokesperson added.
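The flow the spokesperson describes can be sketched in a few lines of code. This is a minimal illustration, not Meta's implementation: it uses an ordinary SHA-256 cryptographic hash, whereas real image-matching systems rely on perceptual hashing (Meta has open-sourced an algorithm called PDQ for this purpose) so that resized or re-encoded copies still match. The function name and sample bytes are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint for an image.

    Illustrative only: SHA-256 matches exact byte-for-byte copies.
    Production matching systems use perceptual hashes that also
    survive resizing and re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Only this short hex string would ever leave the user's device,
# never the image itself.
print(fingerprint(b"\x89PNG...placeholder image bytes..."))
```

Because the hash is a fixed-length one-way digest, sharing it with StopNCII.org reveals nothing about what the picture looks like.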



“Only hashes, not the images themselves, are shared with StopNCII.org and participating technology platforms,” Antigone Davis, global head of safety for Meta, said in a blog post.

“This feature prevents further distribution of that NCII content and keeps those images safe in the possession of the owner.”

StopNCII.org builds on a pilot program launched in Australia in 2017, which asked members of the public for photos of themselves to create hashes that could be used to detect similar images on Facebook and Instagram.

That pilot provided the foundation for the new tool to stop revenge porn.

If anyone is concerned that their intimate photos or videos have been or may be posted on either platform, they can open a case through the global website StopNCII.org


Meta originally planned to let people send their intimate images or videos to Facebook directly to prevent them from spreading, but the sensitive media would have been reviewed by human moderators before being turned into unique digital fingerprints, NBC News reports.

Knowing this, the social media company chose to engage a third party, StopNCII, which specializes in image-based abuse, online safety and women’s rights.

StopNCII.org is intended for adults over the age of 18 who believe an intimate image of them may be, or already has been, shared without their consent.

A 2019 report released by NBC revealed that Meta identifies nearly 500,000 cases of revenge porn every month.

To deal with the influx of revenge porn, Facebook employs a team of 25 people who, along with an algorithm developed to identify nudes, help vet reports and remove photos.

But with StopNCII.org, human moderators are replaced by hash matching that can detect and identify the images after potential victims generate hashes of the explicit content on their own devices.
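On the platform side, the matching step described above amounts to checking each upload's fingerprint against the shared database. The sketch below is a hypothetical illustration, assuming an exact-match hash set; the variable names, sample bytes, and the use of SHA-256 (rather than a perceptual hash) are assumptions for clarity, not Meta's actual system.

```python
import hashlib

# Hypothetical database of fingerprints submitted via StopNCII.org;
# it holds only hashes, never the images themselves.
reported_hashes = {
    hashlib.sha256(b"victim-submitted image bytes").hexdigest(),
}

def should_block(upload_bytes: bytes) -> bool:
    """Return True if an upload matches a reported fingerprint.

    Illustrative exact-match check; production systems use perceptual
    matching so re-encoded or resized copies are also caught.
    """
    return hashlib.sha256(upload_bytes).hexdigest() in reported_hashes

print(should_block(b"victim-submitted image bytes"))  # True: known copy
print(should_block(b"unrelated photo bytes"))         # False
```

Because only hashes cross the wire, the platform can flag a match without ever holding, or being able to reconstruct, the original image.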

WHAT IS REVENGE PORN ON FACEBOOK?

Much of what constitutes revenge porn falls under Facebook’s rules on nudity.

However, in March 2015, the social network released specific community guidelines to address the growing problem of revenge porn.

The section ‘Sexual Violence and Exploitation’ deals specifically with the subject.

The guidelines state: “We remove content that threatens or promotes sexual violence or exploitation.

“This also includes sexual exploitation of minors and assault.

“To protect victims and relatives, we also remove photos or videos of incidents of sexual violence and images shared in revenge or without the consent of the people in the images.

“Our definition of sexual exploitation includes solicitation of sexual material, any sexual content involving minors, threats to share intimate images, and offers of sexual services.

“We will refer this content to law enforcement where appropriate.”
