Last week 348 people were arrested in Canada – and 386 young children rescued – in one of the largest child sexual abuse investigations ever seen. It defies belief that anyone would sexually abuse children, least of all the teachers and doctors entrusted with their care.
But this awful case highlights the depths to which humanity can sink.
And while society will never wholly eliminate such depravity, we should do everything in our power to protect children from harm.
We actively remove child sexual abuse imagery from our services and immediately report abuse to the authorities. This evidence is regularly used to prosecute and convict criminals.
But as David Cameron said in a speech this summer, there's always more that can be done.
We've listened, and in the last three months put more than 200 people to work developing new, state-of-the-art technology to tackle the problem.
Cleaning up search: We've fine-tuned Google Search to prevent links to child sexual abuse material from appearing in our results.
While no algorithm is perfect – and Google cannot prevent paedophiles from adding new images to the web – these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids.
As important, we will soon roll out these changes in more than 150 languages, so the impact will be truly global.
Deterrence: We're now showing warnings – from both Google and charities – at the top of our search results for more than 13,000 queries. These alerts make clear that child sexual abuse is illegal and offer advice on where to get help.
Detection and removal: There's no quick technical fix when it comes to detecting child sexual abuse imagery.
This is because computers can't reliably distinguish between innocent pictures of kids at bathtime and genuine abuse. So we always need to have a person review the images.
Once that is done – and we know the pictures are illegal – each image is given a unique digital fingerprint.
This enables our computers to identify those pictures whenever they appear on our systems. And Microsoft deserves a lot of credit for developing and sharing its picture detection technology.
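For readers curious about the mechanics, here is a minimal, purely illustrative sketch of how fingerprint matching of this kind can work in principle. It is not Google's or Microsoft's actual system: the function names, file path and fingerprint values are hypothetical, and it uses an ordinary cryptographic hash (SHA-256), which only matches byte-identical files, whereas production tools such as Microsoft's PhotoDNA use perceptual hashes so that resized or re-encoded copies of a known image still match.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints for images that human reviewers have
# already confirmed are illegal (the values below are placeholders).
KNOWN_FINGERPRINTS: set[str] = {
    "placeholder_fingerprint_1",
    "placeholder_fingerprint_2",
}


def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image.

    For illustration only: SHA-256 matches exact copies. A real system
    would use a perceptual hash so altered copies are also detected.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def is_known_abuse_image(path: Path) -> bool:
    """Check an uploaded file against the set of known fingerprints."""
    return fingerprint(path.read_bytes()) in KNOWN_FINGERPRINTS


if __name__ == "__main__":
    # Hypothetical usage: scan a newly uploaded file and flag it for
    # human review and reporting if it matches a known fingerprint.
    upload = Path("uploaded_image.jpg")
    if upload.exists() and is_known_abuse_image(upload):
        print("Match found: flag for review and report to the authorities.")
```

The key design point, as the article notes, is that the fingerprint set is built only after a person has reviewed the images, so the automated step merely re-identifies material that has already been judged illegal.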
But paedophiles are increasingly filming their crimes. So our engineers at YouTube have created a new technology to identify these videos.
We're already testing it at Google, and in the new year we hope to make it available to other internet companies and child safety organisations.
Technical expertise: There are many organisations working to fight the sexual exploitation of kids online – and we want to ensure they have the best technical support.
So Google plans to second computer engineers to both the Internet Watch Foundation (IWF) here in Britain and the US National Center for Missing and Exploited Children (NCMEC). We also plan to fund internships for other engineers at these organisations.
This will help the IWF and NCMEC stay one step ahead. The sexual abuse of children is a global challenge, and success depends on everyone working together – law enforcement, internet companies and charities.
We welcome the lead taken by the British Government, and hope that the technologies developed (and shared) by our industry will make a real difference in the fight against this terrible crime.