Tech Giants 'Failing' Children as Over 3,000 Abuse Images Sent to Midlands Youngsters

Technology companies are being accused of failing to protect children online after more than 3,000 crimes involving child sexual abuse images were recorded in the West Midlands region. The National Society for the Prevention of Cruelty to Children (NSPCC) has issued a stark warning, stating that children across the United Kingdom are being let down by firms that should be ensuring their digital safety.

Alarming Statistics Reveal Scale of the Problem

According to data obtained through a Freedom of Information request submitted by the NSPCC, a total of 3,154 offences relating to indecent and prohibited images of children were recorded in the West Midlands during the year ending March 31, 2025. This disturbing figure forms part of a wider national crisis, with police forces across the UK logging 36,829 similar offences during the same period.

The figures represent a concerning nine per cent year-on-year increase. Where the platform involved could be identified, covering 10,811 UK crimes, Snapchat was implicated in more than 40 per cent of cases. Meta's platforms collectively accounted for almost a quarter of offences at 24 per cent: Instagram was responsible for eight per cent, WhatsApp for seven per cent, Facebook for five per cent, and Messenger for four per cent.

Charity Demands Immediate Action and Technological Solutions

The NSPCC is now urging technology companies to implement existing technology that can block nude images from being created, shared, or viewed on children's devices. The charity argues that real-time blocking of illegal images would help prevent these crimes before they occur. "It is indefensible that we are still seeing around 100 child sexual abuse image offences recorded every single day," said Chris Sherwood, chief executive of the NSPCC.

Sherwood emphasised the human cost behind these statistics: "Behind every one of these offences is a child who has been groomed, abused and manipulated. They are left to carry the trauma, whilst tech companies continue to profit handsomely." He questioned why companies have not already deployed the available protective technology, stating: "The real question is: what's stopping them? If they continue to drag their feet, Government must show their might by stepping in and compelling them to act."

Personal Impact and Government Commitments

The devastating personal consequences of these failures are illustrated by the experience of a 17-year-old boy who spoke to Childline. He shared: "I shared a nude online and it was leaked, so everyone at school saw it. I was in a really bad way, so I moved schools. The nude pictures still come up as random people message me and blackmail me with them. I'm worried about my new friends seeing them and how the leaked nudes will impact my career in the future."

The UK Government has previously committed in its Violence Against Women and Girls (VAWG) strategy to collaborate with technology companies to prevent children from taking or sharing nude images. However, the NSPCC maintains that voluntary measures have proven insufficient and that stronger regulatory action is now necessary to protect young people from online exploitation.

The charity's call to action comes as technology companies face increasing scrutiny over their role in facilitating harmful content. With nearly 37,000 child sexual abuse image crimes recorded nationally last year, the NSPCC argues that the time for excuses has passed and that immediate implementation of protective technologies is essential to safeguard children across the Midlands and throughout the United Kingdom.