How AI is Helping Law Enforcement Meet the Challenges of Online Child Sexual Exploitation
When it comes to artificial intelligence (AI), the talk often focuses on business growth or revenue. Today, we’re looking at something altogether more meaningful and serious: online child sexual exploitation (OCSE).
For Chris Saunders, Principal Consultant at Daemon, AI is more than a corporate tool. It’s about helping people in some of the toughest circumstances imaginable and giving frontline law enforcement officers the support and tools they need to protect victims of online child sexual exploitation more effectively.
Chris has worked across defence, aerospace, nuclear energy, and space exploration. Yet in recent years his focus has shifted closer to home. Working with the UK government and law enforcement agencies, both nationally and internationally, he has harnessed AI to support OCSE investigations that would otherwise be overwhelming using manual (non-AI) methods. The result has been the identification, location, and protection of child victims earlier than conventional investigative means could achieve, and in much greater volumes. That’s good for the victims involved and for society in general.
Bringing order to the unmanageable
One of the biggest challenges facing OCSE investigation teams is sheer scale. Officers are often tasked with reviewing mountains of digital material, far more than any human could reasonably process. Case backlogs become unmanageable and grow daily, and the explosion in OCSE activity during the COVID pandemic made the problem far worse. That’s where AI came in. As Chris explains,
‘AI has been fantastic in automatically processing tens of millions of digital media items much quicker than humans can do it and more accurately in some cases’.
What this means in practice is that investigators can quickly spot links across multiple cases, identify repeat victims, and combine evidence that may be scattered across different geographical jurisdictions and multiple media items.
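To make that concrete, one widely used building block for linking the same media item across separate cases is perceptual hashing, where near-identical images produce near-identical fingerprints. The sketch below is a generic illustration under that assumption, not a description of the tooling Chris actually built; the library, folder layout, and matching threshold are all hypothetical choices.

```python
# Minimal, illustrative sketch only: flag media items that recur across cases
# using perceptual hashes. Requires: pip install imagehash pillow.
# The folder layout and threshold are hypothetical placeholders.
from pathlib import Path
from collections import defaultdict

import imagehash
from PIL import Image

MAX_DISTANCE = 4  # small Hamming distance => visually near-identical images

# Hash every media item, remembering which case it came from.
hashes = []
for case_dir in Path("cases").iterdir():              # e.g. cases/case_001/...
    if not case_dir.is_dir():
        continue
    for path in case_dir.glob("*.jpg"):
        hashes.append((imagehash.phash(Image.open(path)), case_dir.name, path.name))

# Report near-identical items that appear in more than one case.
links = defaultdict(set)
for i, (h1, case1, file1) in enumerate(hashes):
    for h2, case2, file2 in hashes[i + 1:]:
        if case1 != case2 and h1 - h2 <= MAX_DISTANCE:
            links[(case1, case2)].add((file1, file2))

for (case1, case2), pairs in links.items():
    print(f"{case1} and {case2} share {len(pairs)} matching media item(s)")
```

A real system would use an index rather than comparing every pair of items, but the principle is the same: the machine surfaces the connections, and investigators decide what they mean.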
The difference this makes is not abstract. Investigations that might once have dragged on for years can now be completed in weeks. This means that victims can be identified and supported sooner, and officers can focus their energy where it matters most. As Chris puts it,
‘AI can process media items and extract meaningful intelligence in a matter of weeks; it would take years for law enforcement to achieve the same results using traditional methods’.
Protecting officers as well as victims
The work of child protection can be emotionally devastating. Investigators are frequently exposed to deeply distressing material, and while muting audio or avoiding content can protect their wellbeing, it also risks missing vital evidential detail.
One of the solutions Chris developed uses large-scale deep learning to turn the audio in media items into text, so that officers can read the content in slow time rather than listening directly. This allows them to capture critical intelligence without having to relive the trauma of the material itself. It’s a striking example of AI not just supporting victims, but also protecting the people who dedicate their lives to protecting them.
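As a rough illustration of what such a speech-to-text step can look like, here is a minimal sketch using the open-source Whisper model. This is an assumption, not the system Chris built: it relies on the openai-whisper package (and ffmpeg), and the file name is a placeholder rather than anything from a real case.

```python
# Minimal, illustrative sketch only: transcribe an audio file to text so it can
# be read rather than listened to. Uses the open-source openai-whisper package
# (pip install openai-whisper; requires ffmpeg). The file name is hypothetical.
import whisper

model = whisper.load_model("base")                  # small general-purpose speech model
result = model.transcribe("evidence_item.wav")      # returns full text plus timestamped segments

# Officers could then read the transcript in slow time instead of listening.
print(result["text"])
for segment in result["segments"]:
    print(f"[{segment['start']:.1f}s - {segment['end']:.1f}s] {segment['text']}")
```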
Another example uses AI classification techniques to rapidly categorise illegal activity present in media items. Previously, a case with 10,000 images would typically take up to a week to review and categorise; now, cases like this can be reviewed in an hour. This helps investigators prioritise cases and make faster decisions to protect children. Chris also used AI to process immense volumes of media alongside cutting-edge capabilities such as facial and object matching and school uniform identification to gather intelligence.
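For a sense of the mechanics, the sketch below shows a generic batch image-classification pass with a pre-trained model, with results ordered by confidence so the highest-priority items surface first. It is a hedged illustration only: the model, category labels, and folder names are placeholders, not the classifiers actually deployed in casework.

```python
# Minimal, illustrative sketch only: batch-score a folder of images with a
# pre-trained classifier so items can be triaged by category and confidence.
# Model, categories, and paths are hypothetical placeholders.
from pathlib import Path

import torch
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).to(device).eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

results = []
with torch.no_grad():
    for path in Path("case_media").glob("*.jpg"):       # hypothetical case folder
        image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)
        probs = torch.softmax(model(image), dim=1)[0]
        confidence, category = probs.max(dim=0)
        results.append((path.name, int(category), float(confidence)))

# Highest-confidence items first, so reviewers can prioritise their attention.
for name, category, confidence in sorted(results, key=lambda r: -r[2]):
    print(f"{name}: category {category} ({confidence:.0%})")
```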
Chris’s use of AI did much more than protect children, though. The same AI tools were used to identify suspects, reduce the burden on investigating officers, filter evidence, support the sentencing of offenders and, working with the Internet Watch Foundation, remove illegal content from the internet.
Responsibility first, technology second
For all its promise, AI isn’t a plug-and-play solution. As Chris is quick to point out, deploying it responsibly is just as important as building it: ‘Once you have deployed your AI solution, you need to ensure that it continues to provide you with accurate information into the future. That means monitoring and maintaining the AI algorithms, post-deployment, and detecting when they start to provide degraded accuracy’.
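In practice, that kind of post-deployment check can be as simple as regularly scoring the model against a small, freshly labelled sample and flagging it for review when accuracy dips. The sketch below is a minimal illustration of that idea; the function names, threshold, and alerting hook are all hypothetical.

```python
# Minimal, illustrative sketch only: a simple post-deployment accuracy check.
# Periodically score a freshly labelled sample and flag the model for review if
# accuracy falls below an agreed threshold. All names are hypothetical.
def monitor_accuracy(model, labelled_sample, threshold=0.95):
    """Return (accuracy, needs_review) for a deployed classifier."""
    correct = sum(1 for item, label in labelled_sample if model.predict(item) == label)
    accuracy = correct / len(labelled_sample)
    return accuracy, accuracy < threshold

# Example use inside a scheduled job (hypothetical objects):
# accuracy, needs_review = monitor_accuracy(deployed_model, this_weeks_sample)
# if needs_review:
#     alert_ml_team(f"Model accuracy degraded to {accuracy:.1%}; review and retraining needed")
```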
The risks of cutting corners are serious. AI tools must be trained on ethically sourced data, designed around strong security, and constantly monitored to ensure they remain accurate. In child protection, where trust and accountability are paramount, these safeguards are not optional: they are essential.
The conversation about AI often focuses on replacement: what jobs it will take, what human roles will disappear. But Chris’s work highlights a different truth. AI is at its best when it works alongside people, not instead of them. It frees officers to concentrate on what only humans can do: supporting victims and improving decision-making across some of the hardest cases in society. And it achieves what humans cannot: processing high volumes of data in record time and surfacing critical intelligence that could otherwise be missed.
The impact of this approach is already being felt. And while AI won’t solve every problem, in the right hands and with the right oversight, it can give those on the front line the tools they need to do their jobs better, faster, and with greater humanity.
At Daemon, we care about deploying AI in the right way. We’re working at the cutting edge of AI and have an AI-first approach across multiple industries. Talk to us today:
