Facebook launches campaign against 'revenge porn' with its Report tool and photo-matching technology
Once an image is reported, the company will block the circulation of intimate images shared on its platforms without permission.
Facebook has announced measures to prevent “revenge porn” content from being shared on Facebook, Messenger and Instagram, a company statement said on Wednesday. The company said users can report intimate images being shared on Facebook-owned platforms to stop the content from being circulated online. The social media giant said the step was part of its endeavour to create a “safe community on and off Facebook”.
The statement said photo-matching technology would be used to stop the circulation of intimate images without consent. Facebook is also collaborating with safety organisations to offer “resources and support to the victims of this behavior”.
Citing a study of United States-based victims whose intimate images were shared without their consent, the statement said 93% of respondents suffered severe emotional distress, while 82% said they experienced “impairment in social, occupational or other important areas of their life”.
Facebook asked users to click the “Report” link next to a post when they see an objectionable image of themselves on platforms owned by the company. A team will review the report and remove the image if it violates the firm’s Community Standards, and may also disable the account of the person who shared it. The individual accused of sharing the contentious photos will be offered the option of appealing the decision.
The statement said users could refer to the company’s guide on dealing with such matters. Facebook said it had sought feedback from 150 internet safety organisations and experts in Kenya, India, Ireland, Washington DC, New York, Spain, Turkey, Sweden and the Netherlands. The decision is also based on inputs from the National Network to End Domestic Violence, the Center for Social Research, the Revenge Porn Helpline (United Kingdom) and the Cyber Civil Rights Initiative.