The Facebook team announced new tools to help detect and report child abuse content.
To build these tools, the team carried out a series of studies, with advice from experts, to develop specific features that take different contexts into account.
New Facebook tools to protect against child exploitation
After analyzing more than one hundred accounts reported to the National Center for Missing and Exploited Children (NCMEC), Facebook developed new policies and tools, now being integrated into the platform, to eradicate content associated with child exploitation.
On the one hand, Facebook is introducing a new flow for cases where it detects that a user is sharing content related to child exploitation without malicious intent, for example, because the content is part of a viral post.
In that case, Facebook will show a safety alert explaining that the content violates the platform’s policies and that sharing it can have legal consequences. The content itself will still be removed, and Facebook says it will be reported to NCMEC.
If the account continues to share this type of content, it will be removed from Facebook. On the other hand, Facebook will also show alerts when it detects that a user is searching the platform with terms related to child exploitation.
To complement this system, Facebook also updated its policies to detect content designed to evade its detection systems, such as posts that use harmless images in a context that clearly relates to child abuse and exploitation.
Not only will the content be deleted, but so will any profiles, groups, or pages that publish these images. To make it easier for users to report posts with this type of content, both Facebook and Instagram updated their reporting flows with a new option, “involves a child,” which allows moderators to prioritize these reports.