Technology companies are facing demands for urgent action as alarming new figures reveal online grooming crimes have reached record levels, with perpetrators targeting children as young as four years old.
West Midlands Sees Dramatic Surge in Offences
Disturbing data provided by police forces across Staffordshire, Warwickshire, West Midlands and West Mercia shows that 391 offences of sexual communication with a child were recorded last year. That is more than double the 194 offences recorded in 2017/18, when the legislation first came into force.
The national picture is equally concerning: figures from 44 UK police forces reveal 7,263 offences of sexual communication with a child were recorded last year, the highest number since records began. Where forces could be directly compared, the number of crimes had almost doubled (99%) since the offence first became law.
Social Media Platforms Under Scrutiny
Analysis of the data shows that social media platforms are being extensively exploited by online predators. The NSPCC revealed that of the 2,111 offences where police could identify the platform used, 40% took place on Snapchat.
Other platforms frequently used by groomers included WhatsApp (9%), as well as Facebook and Instagram (9%). The charity emphasised that the real number of crimes is likely significantly higher because abuse often occurs in private digital spaces where detection is more challenging.
Where gender was known, 80% of targeted children were girls, highlighting their particular vulnerability. Nationally, the youngest victim identified was a four-year-old boy.
Devastating Impact on Young Lives
The human cost of these crimes is profound, as illustrated by harrowing accounts shared with support services. A 14-year-old girl who contacted Childline described her experience: "I feel so insecure all the time, so when this guy I've met online, who's a few years older, started flirting with me, that made me feel so special. He seemed to care, but now he's insisting I send him nudes."
She added: "I feel like I've been tricked but I'm afraid what he might do if I just block him. I can't control how anxious this makes me feel."
A parent from the Midlands who contacted the NSPCC Helpline expressed their shock upon discovering their daughter had been targeted: "I'm really shaken about everything going on, I can't believe I didn't realise my daughter was being groomed online. She hasn't stopped apologising for sending the pictures, I can't tell her enough times that it wasn't her fault."
Call for Comprehensive Safety Measures
The NSPCC is publishing new research outlining solutions to prevent, detect and disrupt grooming in private messaging environments. The research indicates that safety measures must be implemented in tandem to be effective, working together to prevent harm throughout the grooming cycle.
Key recommendations include:
- Implementing tools on children's devices that can scan for nude images and identify child sexual abuse material before it's shared
- Using metadata analysis to spot suspicious patterns without reading private messages, flagging behaviours such as adults repeatedly contacting numerous children or creating fake profiles
- Creating barriers for adult profiles engaging with children on social media platforms, including restrictions on search capabilities and contact limits
Chris Sherwood, NSPCC chief executive, stated: "It's deeply alarming that online grooming crimes have reached a record high across the UK, taking place on the very platforms children use every day. At Childline, we hear first-hand how grooming can devastate young lives. The trauma doesn't end when the messages stop, it can leave children battling anxiety, depression, and shame for years."
He urged technology companies to act immediately, emphasising that children's safety must be built into platform design from the outset rather than treated as an afterthought. The charity is calling on tech companies, Ofcom, and the Government to commit to using every available tool to stop perpetrators.