
Snapchat most-used app for grooming, says NSPCC



The messaging app Snapchat is the most widely used platform for online grooming, according to police figures supplied to the children’s charity the NSPCC.

More than 7,000 Sexual Communication with a Child offences were recorded across the UK in the year to March 2024 – the highest number since the offence was created.

Snapchat made up nearly half of the 1,824 cases where the specific platform used for the grooming was recorded by the police.

The NSPCC said it showed society was “still waiting for tech companies to make their platforms safe for children.”

Snapchat told the BBC it had “zero tolerance” of the sexual exploitation of young people, and had extra safety measures in place for teens and their parents.

Becky Riggs, the National Police Chiefs’ Council lead for child protection, described the data as “shocking.”

“It is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow,” she added.

Groomed at the age of 8

The gender of the victims of grooming offences was not always recorded by police, but of the cases where it was known, four in five victims were girls.

Nicki – whose real name the BBC is not using – was eight when she was messaged on a gaming app by a groomer who encouraged her to go on to Snapchat for a conversation.


“I don’t need to explain details, but anything that you can imagine happening happened in those conversations – videos, pictures, requests for certain material from Nicki, etcetera,” her mother, who the BBC is calling Sarah, explained.

Sarah then created a fake Snapchat profile pretending to be her daughter, and the man messaged it – at which point she contacted the police.

She now checks her daughter’s devices and messages every week, despite her daughter’s objections.

“It’s my responsibility as mum to ensure she is safe,” she told the BBC.

She said parents “cannot rely” on apps and games to do that job for them.

‘Problems with the design of Snapchat’

Snapchat is one of the smaller social media platforms in the UK – but is very popular with children and teenagers.

That is “something that adults are likely to exploit when they’re looking to groom children,” says Rani Govender, child safety online policy manager at the NSPCC.

But Ms Govender says there are “problems with the design of Snapchat which are also putting children at risk.”

Messages and images on Snapchat disappear after 24 hours – making incriminating behaviour harder to track – and senders also know if the recipient has screengrabbed a message.

Ms Govender says the NSPCC hears directly from children who single out Snapchat as a concern.

“When they make a report [on Snapchat], this isn’t listened to, and they’re able to see extreme and violent content on the app as well,” she told the BBC.


A Snapchat spokesperson told the BBC the sexual exploitation of young people was “horrific.”

“If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities,” they added.

Record offending

The number of recorded grooming offences has risen since the offence of Sexual Communication with a Child came into force in 2017, reaching a record high of 7,062 in the latest year.

Of the 1,824 cases where the platform was known in the last year, 48% were recorded on Snapchat.

Reported grooming offences on WhatsApp rose slightly in the past year. On Instagram and Facebook, known cases have fallen over recent years, according to the figures. All three platforms are owned by Meta.

WhatsApp told the BBC it had “robust safety measures” in place to protect people on its app.

Jess Phillips, minister for safeguarding and violence against women and girls, said social media companies “have a responsibility to stop this vile abuse from happening on their platforms”.

In a statement, she added: “Under the Online Safety Act they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.”

The Online Safety Act includes a legal requirement for tech platforms to keep children safe.

From December, big tech firms will have to publish their risk assessments on illegal harms on their platforms.


Media regulator Ofcom, which will enforce those rules, said: “Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children.

“We’re prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes.”


