Users who don’t want certain images shared on the social networking website without their consent can send the intimate images in question to Facebook so the site can automatically “hash” them.
This means the system scans the picture and turns it into a digital fingerprint that can recognize the photo in any future attempt to share it. As a result, the image is blocked whenever somebody tries to share or re-upload it, thwarting the practice known as revenge porn.
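Facebook has not published the details of its matching technology, but the general idea resembles perceptual hashing, where an image is reduced to a compact fingerprint that stays stable under small edits. The sketch below illustrates that idea with a simple “average hash” over a toy 8×8 grayscale grid; all names and values here are illustrative assumptions, not Facebook’s actual system.

```python
# Illustrative sketch of perceptual "average hashing". Facebook's real
# matching technology is not public; this only demonstrates the general
# concept of fingerprinting an image so near-copies can be recognized.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Each pixel contributes one bit: is it brighter than average?
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# A toy "image" and a lightly edited copy (one pixel brightened).
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
edited = [row[:] for row in original]
edited[0][0] = edited[0][0] + 10

h_orig = average_hash(original)
h_edit = average_hash(edited)
# The fingerprints stay close even though the raw pixels differ.
print(hamming_distance(h_orig, h_edit))
```

Because matching compares fingerprints rather than raw files, a blocked image stays blocked even after minor recompression or cropping, which is what makes this approach suited to stopping re-uploads.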
Facebook is running the pilot program on its Australian platform with the help of a government agency that has promised to empower revenge porn victims and allow them to fight back before the images are uploaded to Instagram, Messenger, or Facebook.
Advocates Hail the Move
Australian authorities noted that an intimate photo or video may well have been taken consensually at one point, but one of the partners may not want that content distributed via Facebook for everyone to see.
Sexual privacy advocates praised the move, since the program could help not only actual victims but also those concerned they may become victims in the future.
Users who want to take part in the program are required to fill in an online form on the Australian agency’s website describing their concerns. Next, they must submit the compromising pictures to the agency via Messenger. Australian authorities will then brief the social network on each individual case.
A Facebook employee will review the image or video and ensure that it is never uploaded to the platform. Facebook has promised to delete the content after a short period of time.
Image Source: Pxhere