Pinterest identifies 1,403% more child abuse material in 2022, but majority of reports come from Facebook
CTV
Major social media sites and digital platforms reported a nine per cent increase in suspected child sexual abuse material in 2022, with 85.5 per cent of all 31.8 million reports coming from Meta platforms Facebook, Instagram and WhatsApp.
"These figures are rising either due to an increase in distribution of this material by users, or because companies are only now starting to look under the hood of their platforms, or both," Lianna McDonald, executive director of the Canadian Centre for Child Protection, said in a news release.
The data comes from the U.S. National Center for Missing and Exploited Children (NCMEC). Both Canada and the U.S. legally require electronic service providers in their countries to report and remove instances of apparent child pornography when they become aware of it in their systems. There are, however, no legal requirements for companies to proactively search for abusive content or use prevention tools to stop it from being uploaded.
"Millions of CyberTipline reports every year, mostly submitted by a handful of companies, is evidence that what we know about the extent of child sexual exploitation online is just the tip of the iceberg," a NCMEC spokesperson told CTVNews.ca, referring to the U.S. reporting program it operates. "Most tech companies around the world choose not to proactively detect and report child sexual exploitation on their networks."
Meta filed 27.2 million reports to CyberTipline in 2022, including 21.2 million from Facebook, five million from Instagram and one million from WhatsApp, a 1.1 per cent increase over 2021. Facebook alone accounted for 66.6 per cent of all reports in 2022.
Facebook is the most popular social media platform in both Canada and the U.S., with roughly three-quarters of adults using it. In a statement to CTVNews.ca, a Meta spokesperson said that the company actively invests in teams and technology to detect, prevent and remove harmful content with tools like AI and image-scanning software.
"We remove 98 per cent of this content before anyone reports it to us and we find and report more (child sexual abuse material) to NCMEC than any other service," Antigone Davis, Meta's head of safety, said in a statement to CTVNews.ca. "We're committed to not only removing (child sexual abuse material) when we discover it but building technology to help prevent child exploitation from happening in the first place."