“Every day of my life, I live in constant fear that someone will see my pictures and recognize me and that I will be humiliated all over again. It hurts me to know someone is looking at them — at me — when I was just a little girl being abused for the camera. I did not choose to be there but now I am there forever in pictures that people are using to do sick things. I want it all erased. I want it all stopped. But I am powerless to stop it just like I was powerless to stop my uncle.”
This quote comes from a victim impact statement and clearly portrays the long-lasting impact of a child's sexual exploitation. The crime is compounded exponentially when the sexual abuse is recorded in digital images or videos and shared online. Many of these child victims, like Amy in the statement above, are now adults living with the frustrating realization that their sexual abuse material is regularly traded online among those with a sexual interest in children.
“I want it all erased. I want it all stopped.”
Can this be done? Does technology exist that can find a single image among millions of others? If it does, can it scale to the billions of images uploaded online every day? And how can child sexual abuse images be isolated online while still respecting the privacy of legitimate users? It is a complex issue, to say the least, but as we have learned, it can be done through very precise technology built on the power of hash values.
It started in 2006, when the Technology Coalition and the National Center for Missing and Exploited Children (NCMEC) teamed up to address the growing problem of child sexual abuse images online, and an idea was born. It started out simple: NCMEC could provide hash values derived from child sexual abuse images previously reported to its CyberTipline by Electronic Service Providers (ESPs). ESPs could voluntarily participate by comparing the NCMEC hash values against hash values of images already uploaded by users to their platforms. Images matching a hash value derived from child sexual abuse images could then be removed by the ESP and reported to the CyberTipline. It was an exciting concept that offered a technological opportunity to reduce the proliferation of child sexual abuse material online. While excited, everyone remained cautiously optimistic, as we knew this would amount to looking for a very tiny needle in a very large haystack.
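The matching step described above can be sketched in a few lines. This is a minimal illustration only, not NCMEC's or any ESP's actual implementation: it assumes the hash list holds SHA-256 digests, and the image bytes are made-up stand-ins.

```python
import hashlib

# Hypothetical hash list for known images (SHA-256 hex digests, purely
# for illustration); in the real system NCMEC supplies the list.
known_hashes = {hashlib.sha256(b"previously-reported-image-bytes").hexdigest()}

def matches_known_hash(uploaded_bytes: bytes) -> bool:
    """Hash an uploaded image and check it against the known-hash list."""
    return hashlib.sha256(uploaded_bytes).hexdigest() in known_hashes

print(matches_known_hash(b"previously-reported-image-bytes"))  # True
print(matches_known_hash(b"an-ordinary-upload"))               # False
```

Because each image reduces to a short fixed-size digest, a platform can check an upload against millions of known hashes in constant time, without retaining or inspecting the abusive images themselves.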
As we began to carefully walk down this path, we learned the limitations of traditional hashing technologies: a cryptographic hash changes completely if an image is altered even slightly, so a resized or recompressed copy no longer matches. Undeterred, we presented this challenge to several US-based technology companies, including Google, Facebook and Microsoft. Not surprisingly, they rose to it. A more robust image-matching technology, called PhotoDNA, was developed. Since then, PhotoDNA has become one of the most significant tools ever created for reducing child sexual abuse material online. Every year, more US-based ESPs adopt PhotoDNA, increasing its reach.
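PhotoDNA itself is proprietary, but the general idea behind robust image matching (perceptual hashing) can be illustrated with a toy "average hash": a slightly altered copy of an image yields a hash within a small Hamming distance of the original, so near-duplicates still match. The pixel data below is a made-up stand-in for a real 8x8 grayscale image.

```python
def average_hash(pixels):
    """Toy perceptual hash: 64 grayscale values (0-255) -> 64-bit int.
    Each bit records whether a pixel is above the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits; a small distance means 'visually similar'."""
    return bin(a ^ b).count("1")

original = [(i * 7) % 256 for i in range(64)]              # stand-in image
altered = [min(255, p + (i % 5)) for i, p in enumerate(original)]  # slight noise

d = hamming_distance(average_hash(original), average_hash(altered))
print(d)  # small: the altered copy still matches
```

A cryptographic hash of `altered` would differ entirely from that of `original`; the perceptual hash differs by only a few bits, which is what lets a matcher catch re-encoded copies of the same image.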
In the US, proactive scanning for child sexual abuse images is voluntary, but reporting is not: when a US-based ESP becomes aware of child pornography on its network, it is required by law to submit a report to NCMEC's CyberTipline. Fast forward to 2017: voluntary hash-value scanning for child sexual abuse images has continued to grow, both in the number of companies using the technology and in the size of the hash-value list. In 2017, NCMEC's CyberTipline received more than 10 million reports, and we have provided more than 500,000 child sexual abuse hash values to the companies. Ninety-four per cent of the CyberTipline reports received in 2017 involved uploaders of apparent child pornography from outside the United States, including some in India. This increased volume has transformed NCMEC into a global clearinghouse that makes CyberTipline reports available to local, state and federal law enforcement in the US, as well as to police agencies in more than 100 countries around the world, and to Europol and Interpol.
The widespread adoption of these technology tools speaks to the extreme precision with which technology companies can voluntarily identify and remove child sexual abuse material from the Internet. This surgical precision meets the need to find the heinous illegal material quickly and proactively while balancing the privacy concerns of legitimate Internet users. However, we can't stop there.
In 2017, the Canadian Centre for Child Protection developed a groundbreaking automated tool, Project Arachnid, to combat the growing proliferation of child sexual abuse material on the Internet: a spider that crawls the open Internet and the dark web for web pages containing images matching child sexual abuse hash values. NCMEC has partnered with the Canadian Centre for Child Protection on Project Arachnid, providing hash values and other technical support.
Over only a six-week period in 2017, Project Arachnid:
- Processed over 230 million web pages
- Detected over 5.1 million unique web pages hosting child sexual abuse material
- Detected over 40,000 unique images of child sexual abuse
What is remarkable about this tool is that when child sexual abuse material is identified, a notice is sent to the hosting provider requesting its immediate removal. So not only are technology companies armed with tools to ensure they are not hosting child sexual abuse material; there is also a cutting-edge tool that can spider the Internet, find those images and send notices to hosting providers that may not be voluntarily taking steps to find and remove such content.
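The crawl-hash-notify loop described above can be sketched as follows. Everything here is hypothetical: the URLs, the stubbed "web" of pages, and the use of SHA-256 stand in for Project Arachnid's real crawling and matching infrastructure.

```python
import hashlib

# Hypothetical known-hash list (SHA-256 hex digests, for illustration only).
KNOWN_HASHES = {hashlib.sha256(b"known-abuse-image").hexdigest()}

# Stubbed "web": page URL -> list of (image URL, image bytes).
FAKE_WEB = {
    "http://host-a.example/page1": [("http://host-a.example/a.jpg", b"known-abuse-image")],
    "http://host-b.example/page2": [("http://host-b.example/b.jpg", b"innocuous-image")],
}

def crawl_and_notify(pages):
    """Hash every image on every page; on a match, record a removal
    notice addressed to the hosting provider of that page."""
    notices = []
    for page_url, images in pages.items():
        for image_url, data in images:
            if hashlib.sha256(data).hexdigest() in KNOWN_HASHES:
                notices.append((page_url, image_url))
    return notices

print(crawl_and_notify(FAKE_WEB))
```

The key design point is that the crawler never needs a human to view the material: matching against the hash list both detects the content and generates the evidence trail for the removal notice.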
In the 18 years I have devoted to child protection at the National Center for Missing and Exploited Children, I can confidently say that hash values of child sexual abuse material are the way forward to eliminate this material from the Internet. Together, NGOs, law enforcement and Internet companies have made a huge difference in the lives of children around the world, and I am proud to have been a part of making these tools a reality. NCMEC is enthusiastic about the discussions that have taken place and the interest that has been expressed in making CyberTipline reports available to law enforcement in India, to further address the issue of child sexual abuse images online.
For more information about the National Center for Missing and Exploited Children, please visit us at www.missingkids.org.
By John Shehan
Exploited Child Division
National Center for Missing and Exploited Children