BEING online is a huge part of children’s lives today, from gaming to social media apps. But without the right safeguards, these platforms can pose a very real and ever-present danger.

New data released by the NSPCC shows that online grooming crimes have reached record levels across the UK. Figures from police in our region show 477 Sexual Communication with a Child offences were recorded in 2023/24 - an increase of 60 per cent on the 297 recorded in 2017/18, the first full year after the offence came into force.

Meanwhile, the number of online grooming crimes recorded by police forces across the UK has increased by 89 per cent in six years (since 2017/18), with more than 7,000 offences recorded in 2023/24.

We’re issuing these findings a year on from the Online Safety Act being passed. They show that Snapchat was the most popular platform used by perpetrators to target children online last year. The messaging app was used in almost half of UK grooming cases where the means of communication was disclosed.

Platforms operated by Meta were also particularly popular among offenders. They featured in over a quarter of UK recorded cases where a platform was known, with WhatsApp (12 per cent), Facebook and Messenger (10 per cent), and Instagram (six per cent) all being used to abuse children.

Facebook, WhatsApp, Snapchat, Instagram and TikTok were all used in cross-platform grooming. This is where the first point of contact between children and would-be offenders happens on the open web - through social media apps, video games and messaging apps on consoles, dating sites and chatrooms. Children are then encouraged to continue communication on private and encrypted messaging platforms, where abuse can take place undetected.

Girls are most likely to fall victim to this form of abuse, making up 81 per cent of total UK recorded cases where gender was known in 2023/24. The youngest victim of online grooming in 2023/24 was a five-year-old boy.

To address this situation head-on, we’re urging Ofcom to strengthen the rules social media platforms must follow to tackle child sexual abuse on their products. Ofcom puts too much focus on acting after harm has taken place, rather than being proactive and ensuring the design features of social media apps aren’t contributing to abuse in the first place.

It’s been a year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children. Ofcom must be ambitious in its regulation and significantly strengthen its approach to make companies address how their products are exploited by offenders. Much abuse is taking place in private messaging which is why we need the Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.

* The NSPCC's safety hub (nspcc.org.uk/keeping-children-safe/online-safety) offers advice for parents to help keep children safe.