Imagine a world where the very platforms meant to connect us turn a blind eye to the most heinous crimes against children. According to Australia’s eSafety Commissioner, that is the reality: she has called out major social media companies, particularly YouTube, for their inaction against child sex abuse material on their platforms.

In a damning report released on Wednesday, the eSafety Commissioner, Julie Inman Grant, highlighted how these tech giants, including Apple and YouTube, have failed to address the alarming number of user reports about child sex abuse material. More concerning still, they could not provide even basic statistics on how long they took to respond to these reports, effectively leaving children vulnerable in an increasingly perilous digital landscape.

Following this alarming revelation, the Australian government made a historic decision to include YouTube in a social media ban aimed at teenagers, acting on the eSafety Commissioner’s advice to revoke an exemption that had initially spared the platform from the restrictions. This shift underscores a growing recognition that, left unchecked, these companies often prioritize profit over the safety of children.

“When left to their own devices, these companies aren’t prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services,” Inman Grant stated emphatically. “No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises or services.”

In a defense that many may find insufficient, Google has previously claimed that abuse material has no place on its platforms, touting its use of advanced techniques to identify and remove such content. Meanwhile, Meta, the owner of Facebook, Instagram, and Threads, whose platforms collectively host over three billion users, insists it prohibits graphic videos, but questions remain about how effectively those rules are enforced.

The eSafety Commissioner’s office, designed to safeguard internet users, has required tech giants including Discord, Microsoft, and WhatsApp to report on their measures to combat child exploitation and abuse material. The report, however, uncovered a series of disturbing “safety deficiencies” across these platforms that further endanger children, including the failure to detect or prevent livestreaming of abusive content and inadequate reporting mechanisms.

Alarmingly, these platforms are not employing “hash-matching” technology across all of their services to identify images of child sexual abuse, limiting their ability to combat this crisis effectively. While Google says it uses such technology, the eSafety Commissioner notes that some providers have not made adequate improvements to address these safety gaps, even after being warned in previous years.

Inman Grant expressed frustration over the lack of cooperation, particularly from Apple and Google: “They didn’t even answer our questions about how many user reports they received about child sexual abuse on their services or details of how many trust and safety personnel Apple and Google have on-staff.”

This ongoing situation raises critical questions about the responsibilities of these tech giants and the urgent need for accountability in protecting our most vulnerable users. The time for action is now; the safety of children online can no longer be disregarded.