
Sexual predators' new business model is spreading - and humans are needed to catch them

Wednesday, 22 April 2026 18:47

By Mickey Carroll, science and technology reporter

For a long time, images and videos of child abuse were hidden away in the darkest corners of the internet, intentionally hard to find.

That's changing. Now, fully commercial sites are springing up all over the open web.

The Internet Watch Foundation, the organisation charged with removing child sexual abuse material (CSAM) from the internet, has seen the number of commercial sites double in the past year alone.

Some are hidden behind apparently innocent website fronts; others sit in the open, just a few clicks away from your social media feeds.

The criminals running these sites aren't selling access to one or two videos of 'category A' material - the most severe category of abuse content as designated by police.

They're encouraging users to download - and pay for - terabytes of content at a time. But like any business, they need a marketing strategy. They've chosen word of mouth.

"[They're using] 'refer-a-friend' schemes whereby if you view the content and you want more, you can spread that link around your social media accounts, and then the more clicks that content gets," says Mabel, an IWF analyst who works under a pseudonym.

"That's new. We never used to see that at all."

Mabel is one of the few people in the world who is legally allowed to hunt down and remove CSAM from the internet. She's also a grandmother.

She added: "I worry that my grandchildren will be presented with these sites in their feeds on their social media, not realise what they are and click on them."

Nearly every refer-a-friend scheme was reported to the IWF by a member of the public, rather than a trained analyst.

That worries analysts like Mabel because it suggests ordinary people are now stumbling across this extreme abuse material in a way they never have before.

"I come into work every day and I know what I'm going to see. I'm expecting to see the content that I see on the internet," she said.

"But can you imagine if you turned on your phone, turned on the computer, and within a few clicks you saw category A content? You can't unsee that once you've seen it."


Many tech firms, including social media companies, have recognised the harm that exposure to such extreme content can do to their employees. Social media moderators are routinely exposed to CSAM, extreme violence and death. It has an impact.

Two years ago, moderators working for Meta began legal action against the company after more than 140 of them were diagnosed with severe PTSD.

Other major social media sites like TikTok are also facing legal action over their treatment of moderators and, as a result, many companies are turning to AI to deal with the majority of extreme content.

They say it will help ease the severe mental load for their human workers.

Even the Metropolitan Police announced last week that it will begin exploring how AI could help the force analyse large volumes of CSAM, leaving officers free to "focus human expertise where it is needed most".

So what about the IWF, where analysts are dealing with more content than ever before? They've seen a 6% increase in the amount of CSAM online in the last year alone.

"Artificial intelligence tools are a supplement, right?" IWF chief executive Kerry Smith said.

"They're a supplement to human intelligence. They aren't a replacement."

She believes her human analysts are worth the cost of the mandatory monthly counselling, stringent recruitment process and ongoing psychological care, because of their "offline understanding" of the internet's underbelly.

"[They have an] understanding of how abuse occurs, what exploitation looks like, how you find particular indicators within those images and within those videos that can help identify an individual," Ms Smith said.

"So I think artificial intelligence is a weapon that we could use to prevent online child sexual abuse and exploitation, but it's not a replacement for human intelligence and human insight."


(c) Sky News 2026: Sexual predators' new business model is spreading - and humans are needed to catch them
