The UAE’s Digital Wellbeing Council, the Ministry of Interior’s Child Protection Center, and Meta have launched a campaign to educate the public about the harm caused by sharing images or videos of child sexual abuse, and how to report such content.
The campaign, which launches on World Children’s Day (November 20), is informed by research conducted earlier this year by Meta and the world’s leading experts on child exploitation, including the National Center for Missing and Exploited Children (NCMEC) and Professor Ethel Quayle, a leading clinical psychologist who specializes in sex offenders.
To date, much of the research on why people engage with child sexual abuse material has focused on evaluating offenders’ psychological make-up. Meta’s research, by contrast, examines behavioural signals at a fixed point in time, drawn from a snapshot of users’ activity on Meta’s platforms.
Researchers evaluated 150 accounts that Meta reported to NCMEC for uploading child exploitation content in July and August 2020 and in January 2021, and found that more than 75% did not exhibit malicious intent (i.e. did not intend to harm a child). Instead, these accounts appeared to share the content for other reasons, such as outrage or poor humour.
Meta reports each individual instance of child exploitation content to NCMEC, including content the company has identified and removed using technology before it has been seen by anyone on Meta platforms. The study also found that the majority of reports Meta sent to NCMEC were for the same or visually similar content. Ninety per cent of the images or videos of child sexual abuse analysed in the study were found to be copies, rather than unique or new content.
In addition, just six pieces of visually distinct media were responsible for more than half of all child-exploitative content that the company reported to NCMEC in the same period.
Based on this analysis, the company developed this campaign together with child safety partners to help reduce instances of child exploitation content being shared on Meta platforms.
“While this data indicates that the number of pieces of content does not equal the number of victims, one victim is one too many. Preventing and eradicating online child sexual exploitation and abuse requires a cross-industry approach, and Meta is committed to doing our part to protect children on and off our apps. We are taking a research-informed approach to develop effective solutions that disrupt the sharing of child exploitation material,” said David Miles, Meta Head of Safety Policy, EMEA.
“No matter the reason, sharing images or videos of child sexual abuse online has a devastating impact on the child depicted in that content. We are working with Meta to get a better understanding of how we can effectively disrupt sharing and prevent re-victimizing children, and also educating people on what they can do to report this crime,” said Abdul Rahman al-Tamimi, Director of Child Protection Center at the Ministry of Interior, United Arab Emirates.
Report It. Don’t Share It.
You can help a child by reporting child exploitation content. If a child is at risk, call the Ministry of Interior’s Child Protection Center Helpline on 116111 and report it. If you see images on Facebook or Instagram of a child being abused, report the photo or video to Meta and to law enforcement. Do not share, download, or comment on the content. There can be criminal penalties for sharing, or sending messages containing, photos and videos of children being sexually abused and exploited. You won’t be asked to provide a copy of the content in any report you submit to Meta.