Facebook Says It’s Putting an End to Revenge Porn Once and For All

Nonconsensual porn, otherwise known as “revenge porn,” is a widespread phenomenon, and research shows that its psychological effects can be devastating. According to one study, 80 to 93 percent of revenge porn survivors report struggling with paranoia, anger and shame after their nude photos or videos are posted without their consent; another survey indicates that 51 percent of survivors have had suicidal thoughts.

Given how devastating the impact of nonconsensual porn can be, it stands to reason that most large social networks would devote significant resources to combating it. On Friday, Facebook announced that it would be rolling out a brand-new tool to fight nonconsensual porn on the platform: essentially, AI that would determine whether or not an image was uploaded with the subject’s full consent. The problem is, we don’t really know how that would actually work.

According to a blog post by Antigone Davis, Facebook’s global head of safety, the software would “proactively detect near nude images or videos that are shared without permission on Facebook and Instagram.” The goal is for Facebook to identify the content before the subject even has to report it, which would be tremendously helpful because “often victims are afraid of retribution so they are reluctant to report the content themselves, or are unaware the content has been shared,” Davis writes.

The image would then be reviewed by a (human) member of the Facebook team to confirm that it was posted without the subject’s consent; if it was, the image would be removed, and the poster’s account would be disabled.
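The workflow Davis describes is simple enough to state in a few lines. Here’s a minimal Python sketch of that review step, purely for illustration; every name here is a hypothetical stand-in, and Facebook has published nothing about its actual internals.

```python
from dataclasses import dataclass

@dataclass
class FlaggedPost:
    """A post the detection software has flagged for human review."""
    post_id: str
    poster_id: str

def apply_review_decision(post: FlaggedPost, shared_without_consent: bool) -> list[str]:
    """If the human reviewer confirms the image was posted without the
    subject's consent, take both actions Facebook describes: remove the
    content and disable the poster's account."""
    actions: list[str] = []
    if shared_without_consent:
        actions.append(f"remove_content:{post.post_id}")
        actions.append(f"disable_account:{post.poster_id}")
    return actions

# Example: a reviewer confirms the flagged image is nonconsensual.
print(apply_review_decision(FlaggedPost("p123", "u456"), True))
# ['remove_content:p123', 'disable_account:u456']
```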

There are, of course, a number of questions raised by this, the chief one being: given that a sizeable percentage of actual human men don’t know what consent is, how would a machine be able to determine whether a pornographic image was shared with its subject’s permission?

While the post itself is mum on that front, in a statement to Gizmodo, Facebook said that the software is “trained on revenge porn in order to better understand what these types of posts would look like.” CNBC also suggests that the software would be able to recognize nonconsensual pornographic images because they would be “coupled with derogatory or shaming text that would suggest someone uploaded the photo to embarrass or seek revenge on someone else.”
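We can only guess at the mechanics, but those two statements suggest a two-signal approach: an image-classifier score combined with a text signal from the caption. Here’s a minimal Python sketch of that idea; the term list, thresholds, and stub scores are all hypothetical assumptions, not Facebook’s actual system.

```python
# Illustrative terms only; a real system would learn these, not hand-list them.
SHAMING_TERMS = {"revenge", "exposed", "leaked", "humiliate"}

def image_nudity_score(image_bytes: bytes) -> float:
    """Stand-in for a classifier trained on known revenge-porn imagery;
    would return an estimated probability that the image is near-nude."""
    return 0.92  # stub value so the sketch runs

def caption_shaming_score(caption: str) -> float:
    """Crude proxy for 'derogatory or shaming text': fraction of
    matched terms, capped at 1.0."""
    words = caption.lower().split()
    hits = sum(1 for word in words if word.strip(".,!?") in SHAMING_TERMS)
    return min(1.0, hits / 2)

def flag_for_human_review(image_bytes: bytes, caption: str) -> bool:
    """Flag a post only when both signals are strong; per Facebook's
    description, a human reviewer then makes the final call."""
    return (image_nudity_score(image_bytes) > 0.8
            and caption_shaming_score(caption) > 0.5)

print(flag_for_human_review(b"...", "she got exposed, sweet revenge"))  # True
```

Even in this toy version, the weakness is obvious: the caption signal only catches posts whose text telegraphs intent, which is exactly the limitation raised below.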

But it’s difficult to imagine a world in which even the most sophisticated software could discern the intent behind a Facebook photo, and given Facebook’s track record at distinguishing “offensive” from “non-offensive” content, it’s easy to see how such broad criteria could backfire. (We’ve reached out to Facebook for clarification, and will update if we hear back.)