Google has teamed up with the U.K.-based nonprofit StopNCII to strengthen its initiatives against the distribution of nonconsensual intimate photos and videos, commonly referred to as revenge porn.
The tech giant will start using hashes from StopNCII, essentially unique digital fingerprints for photos and videos, to proactively detect and remove nonconsensual intimate imagery from its Search platform.
StopNCII enables adults to safeguard their intimate photos from being posted online by generating a hash, a digital marker, for each image. These hashes are shared with partner platforms such as Facebook, which can then automatically detect and remove matching images.
Importantly, the actual intimate images never leave the user’s device; only the hash is uploaded to StopNCII’s servers.
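To make the hash-matching idea concrete, here is a minimal, purely illustrative Python sketch of the general workflow: a hash is computed on the user's device, only that hash is shared, and a partner platform compares hashes of uploaded files against the shared list. It uses SHA-256 as a stand-in for simplicity; StopNCII's actual system relies on perceptual hashing, which also matches re-encoded or slightly altered copies of an image, and its real API and formats are not reflected here.

```python
import hashlib
from pathlib import Path


def hash_image(path: Path) -> str:
    """Compute a hash of an image file locally; only this string is shared.

    SHA-256 is used here only for illustration. A production system such as
    StopNCII uses perceptual hashes so near-duplicate images still match.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


# On the user's device: the image itself never leaves the device,
# only its hash is submitted to the hash-sharing service.
submitted_hashes = {hash_image(Path("private_photo.jpg"))}


# On a partner platform: hash each uploaded file and check it against
# the shared hash list; block or remove the upload on a match.
def should_block(uploaded_file: Path, known_hashes: set[str]) -> bool:
    return hash_image(uploaded_file) in known_hashes


if should_block(Path("some_upload.jpg"), submitted_hashes):
    print("Match found: remove or block this upload.")
```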
“Our current tools enable individuals to request the removal of NCII from Search, and we’ve continued to introduce ranking updates that make this kind of content less visible,” Google stated in a blog entry. “We’ve also received feedback from survivors and advocates indicating that, given the scale of the open web, there is still more work to do to lessen the burden on those impacted.”
Google is a relatively late adopter of StopNCII's system; the collaboration comes about a year after Microsoft brought the tool to Bing. Other platforms working with StopNCII include Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, OnlyFans, and X.
The partnership with StopNCII is Google's latest step to address nonconsensual intimate imagery. Over the past year, Google has introduced new ways for people to remove nonconsensual deepfake images from Search and has made such content harder to surface.