Facebook launches new AI detection tool to help revenge porn victims
Facebook is launching new technology that uses artificial intelligence (AI) to detect nude images, in a bid to better protect victims of revenge porn.
The social network is using machine learning to identify near-nude images or videos shared on Facebook and Instagram; flagged content is then sent to a reviewer, who decides whether it violates the company's standards.
In most cases, the offending account will be disabled, Facebook said, but there will be an appeals process for anyone who believes their account has been wrongly blocked.
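Facebook has not published details of these systems, but the flow described above, an ML detector feeding a human reviewer, with enforcement and an appeals route, can be sketched in outline. The Python below is purely illustrative: the classifier, reviewer, threshold and all names are assumptions, not Facebook's implementation.

```python
# Hypothetical sketch of the reported moderation flow: automated detection,
# human review against the platform's standards, account disabling, and an
# appeals path. Every name here is illustrative; Facebook's internal systems
# are not public.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Account:
    user_id: str
    disabled: bool = False
    appeals: list[str] = field(default_factory=list)

def handle_upload(account: Account,
                  image: bytes,
                  classifier: Callable[[bytes], float],  # ML detector (assumed)
                  reviewer: Callable[[bytes], bool],     # human review step
                  threshold: float = 0.9) -> None:
    """Route a flagged upload through review and, if confirmed, enforcement."""
    if classifier(image) < threshold:
        return                   # not flagged by the ML detector
    if reviewer(image):          # reviewer confirms a standards violation
        account.disabled = True  # "in most cases" the account is disabled

def appeal(account: Account, reason: str) -> None:
    """Record an appeal for an account its owner believes was wrongly blocked."""
    account.appeals.append(reason)
```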
At present, victims have to report non-consensual intimate images as they spot them, but the social network believes many are reluctant to do this for fear of retribution.
Facebook and other social networks have long faced criticism for being too slow to remove illegal material from their platforms.
“The sharing of intimate images online can have serious emotional and physical consequences for the people whose photographs were posted,” said Facebook’s Radha Iyengar, head of product policy research, and Karuna Nain, global safety policy programs manager.
“Sometimes called ‘revenge porn’, it’s really a form of sexual violence that can be motivated by an intent to control, shame, humiliate, extort and terrorise victims.”
Revenge porn is illegal in the UK, where those found guilty can face a fine or even imprisonment.
The move builds on an earlier pilot scheme in which users could send Facebook any intimate image they feared might be made public; the company would create a unique fingerprint of the photo and then delete the image from its servers.
The fingerprint enables Facebook to recognise the image, without keeping a copy of it, should anyone try to upload it to Facebook, Messenger or Instagram, and to block it from being posted.
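Facebook has not disclosed its fingerprinting algorithm, but matching of this kind is commonly done with perceptual hashing, which identifies an image from a compact digest rather than a stored copy. The sketch below uses the open-source Python imagehash library as a stand-in; the threshold and function names are illustrative assumptions.

```python
# Minimal sketch of hash-based image matching (illustrative only; Facebook's
# actual fingerprinting algorithm has not been published). Uses the
# open-source "imagehash" library, whose perceptual hashes survive minor
# edits such as resizing or re-compression.
from PIL import Image
import imagehash

# Assumption: smaller Hamming distances mean closer matches.
MATCH_THRESHOLD = 5

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash; the original image need not be retained."""
    return imagehash.phash(Image.open(path))

def matches_known_fingerprint(upload_path: str,
                              known: list[imagehash.ImageHash]) -> bool:
    """Return True if an upload matches any previously stored fingerprint."""
    candidate = fingerprint(upload_path)
    # Subtracting two ImageHash values yields their Hamming distance.
    return any(candidate - h <= MATCH_THRESHOLD for h in known)

# Example: fingerprint a reported image, then screen a new upload against it.
# known_hashes = [fingerprint("reported_image.jpg")]
# if matches_known_fingerprint("new_upload.jpg", known_hashes):
#     print("Block upload: matches a reported intimate image")
```

Because only the digest is stored, a match can be detected at upload time without the platform ever retaining the original image, which is the property the scheme relies on.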
“When images and videos are shared online without the person’s consent the impact can be devastating and can leave people feeling distressed and humiliated,” said Diana Fawcett, chief officer at independent charity Victim Support.
“Once images are put online and made public, victims have very little control over where they end up and who sees them, and this is likely to leave people feeling extremely powerless.
“It’s so important that revenge porn is taken seriously and that more measures are taken to protect victims.”