221% Rise in Online Child Sexual Exploitation Cases: Why a Ban Alone Is Not Enough to Protect Our Children
At a time when global reports reveal that one in three internet users is a child under 18, Jordan's first Children's Rights Country Report, recently issued by the National Council for Family Affairs (NCFA), shows a 221% increase in reported cases of online child sexual exploitation over the past three years.
This figure invites two interpretations. On one hand, it may reflect a positive trend: rising awareness and stronger monitoring and reporting mechanisms, meaning more cases are finally being uncovered after years of remaining hidden. On the other hand, and more importantly, it is a stark warning. Crimes of this nature are among the most underreported globally. Estimates by UNICEF suggest that reported cases represent, at best, only 5 to 10% of the actual scale. In other words, what we are seeing is likely just the tip of the iceberg, while the vast majority of abuse remains hidden behind screens, silenced by fear, stigma, and the lack of safe reporting channels.
These findings come at a moment of growing policy attention. The government has established a national committee to protect children and adolescents from the risks of social media, while public calls for stricter regulation continue to intensify, sometimes extending to demands for banning social media and AI for those under 16. This raises a more fundamental question: do we need to restrict children’s access to the internet, or do we need to make the internet safer for children?
The uncomfortable truth is that today’s digital environment is built around the adult user and driven primarily by commercial growth. Algorithms prioritise what captures attention, not what protects well-being. Safety features are often difficult to find or disabled by default. Reporting systems are slow or ineffective. Meanwhile, data is collected at scale, even from young users.
These are not technical oversights; they are deliberate design choices. The harm children experience online is not accidental. It is predictable and, crucially, preventable. This is why the debate must shift from access to design.
Globally, this shift is beginning to take shape, not only in public discourse but also in legislation. One notable example is Brazil's approach through what is known as the "ECA Digital" framework. Rather than imposing a blanket ban, it focuses on holding technology companies accountable for the environments they create. The premise is simple but transformative: safety and privacy must be built into digital products from the outset, not added later in response to harm or public pressure.
In practice, this means requiring companies to assess the impact of their platforms on children’s rights, invest in human oversight and rapid response systems, and eliminate exploitative and addictive design features. It also means minimising data collection, prohibiting the commercial profiling of children, ensuring accessible reporting mechanisms, and providing effective remedies when harm occurs. Transparency around risks and failures is no longer optional.
What such approaches do is move the conversation from good intentions to legal accountability. Child safety can no longer be treated as a matter of corporate goodwill or a feature to be improved over time. Technology companies generate billions of dollars from user attention and data, including that of children. In this context, claims of limited resources for child safety are difficult to justify.
In the local context, quick fixes such as bans or broad restrictions may seem appealing, but they address symptoms rather than causes. Children will always find ways to go online. The more pressing question is: what kind of internet are they entering? One designed to exploit their attention, data, and vulnerability, or one intentionally built to protect them?
The issue is no longer whether children are online. It is the kind of online world we are exposing them to.
Nadine Nimri, a Jordanian journalist, is an advocacy and communications strategist.