While Facebook is currently mounting a full-scale assault on hate speech, explicit content and fake accounts, there is another category it will be taking responsibility for removing: revenge porn.
According to an announcement from Facebook's Global Head of Safety, Antigone Davis, Facebook will soon be implementing a new form of AI that targets "non-consensual intimate images," commonly known as "revenge porn," to protect people against public shaming and online abuse.
Facebook's move to stop revenge porn is intended to minimize the amount of online abuse its user base currently experiences. In a survey with a sample size of 1,606 people, "61% of respondents said they had taken a nude photo/video of themselves and shared it with someone else," and "23% of respondents were victims of revenge porn." The survey also details that "93%" of the victims suffered "significant emotional distress" and "42% sought out psychological services." Facebook has also created a new hub called "Not Without My Consent" where victims can report their images and videos; a link to it can be found here.