Facebook uses technology and people to thwart ‘revenge porn’

Revenge porn, the sharing of an intimate image of another person without his or her permission, is an abusive practice that often causes significant emotional distress. That's true even if the image was originally taken with the person's consent but shared without it. In a blog post, Facebook's Global Head of Safety, Antigone Davis, cited data showing that, among victims, "93% report significant emotional distress and 82% report significant impairment in social, occupational or other important areas of their life."

Personally I think we need a better term to describe this material. It isn’t necessarily about revenge and it’s not necessarily porn. It’s about a betrayal of trust.

Facebook is trying to keep these images off its platforms by encouraging people to report them and by using technology to match images that have been previously reported and prevent them from being viewed on the service.

Davis outlined four strategies that the company is using to combat the spread of non-consensual sexual images:

  • Encouraging people to report it if they see it
  • Employing specially trained members of its operations team to review and remove images and block offending accounts
  • Using photo-matching technologies to thwart further attempts to share the image on Facebook, Messenger and Instagram
  • Partnering with safety organizations to offer resources and support to victims
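Facebook hasn't published the details of its photo-matching system, but such systems typically rely on perceptual hashing: an image is reduced to a compact fingerprint that stays nearly the same when the image is resized, recompressed or lightly edited, so a re-upload can be matched against previously reported fingerprints. The following is a minimal sketch of that idea using a simple "average hash" over a toy grayscale grid; the pixel data and the `THRESHOLD` cutoff are illustrative assumptions, not Facebook's actual method or parameters.

```python
def average_hash(pixels):
    """Compute a simple perceptual 'average hash' of a grayscale image
    given as a 2D list of intensities (0-255). Each bit is 1 if the
    corresponding pixel is brighter than the image's mean intensity."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits between two hashes; a small distance
    suggests the images are visually near-identical."""
    return bin(h1 ^ h2).count("1")

# A previously reported image (toy 4x4 grayscale)...
reported = [[10, 20, 200, 210],
            [15, 25, 205, 215],
            [12, 22, 202, 212],
            [18, 28, 208, 218]]
# ...and a slightly altered re-upload (every pixel nudged by 1).
reupload = [[11, 21, 199, 209],
            [16, 26, 204, 214],
            [13, 23, 201, 211],
            [19, 29, 207, 217]]

THRESHOLD = 3  # hypothetical cutoff for treating two images as the same
distance = hamming_distance(average_hash(reported), average_hash(reupload))
print(distance <= THRESHOLD)  # True: the altered copy still matches
```

Because the hash depends on each pixel only relative to the overall mean, small pixel-level edits leave the fingerprint unchanged, which is what lets a service block re-uploads without storing the original image itself.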

Davis said that "If someone tries to share the image after it's been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it." She added that Facebook is also partnering "with safety organizations to offer resources and support to the victims of this behavior." My nonprofit, ConnectSafely.org, is one of those organizations. Others include the Cyber Civil Rights Initiative, the National Network to End Domestic Violence and the Family Online Safety Institute.
