By Sigurður Ragnarsson

Videntifier Enhances Victim Identification at National Center for Missing & Exploited Children

Updated: Oct 17, 2023

Helping create a light in the darkest corners of the web


The National Center for Missing & Exploited Children (NCMEC) is dedicated to helping rescue children from dangerous and sexually exploitative circumstances. Learn how we partnered with NCMEC’s technology department to provide a visual fingerprinting solution that can help stop the spread of child sexual abuse material online and aid law enforcement in removing children from harm.


 



Included in this article:

  • Who is NCMEC?
  • NCMEC and child sexual abuse online
  • The challenge
  • The solution
  • Breaking down the tech: How NCMEC uses Videntifier video identification
  • Illustrating NCMEC’s improvement
  • The result
  • Moving forward
  • A light in the darkest corners of the web


Who is NCMEC?


Since its establishment in 1984, the National Center for Missing & Exploited Children (NCMEC) has made an immense effort to help rescue and protect children in dangerous and exploitative situations. A nonprofit organization, NCMEC is dedicated to serving as a voice for children who cannot defend themselves against abusers, as well as to helping parents bring missing or lost children home. The organization is best known for creating age-progression photos for long-term missing children’s cases and ensuring the faces of missing children are displayed publicly so they can be recognized and brought to safety. Since the nonprofit began, it has operated a national toll-free hotline, 1-800-THE-LOST, which has received more than 5 million calls to date.


NCMEC and child sexual abuse online


Adapting to the digital age and taking advantage of its capabilities, NCMEC created the CyberTipline website in 1998 to give the public a place to easily report child sexual abuse and exploitation online.


Via the CyberTipline, the organization’s victim identification team reviews millions of reported child sexual abuse material (CSAM) images and videos every week, and is in constant communication with law enforcement to help them quell exploitation, rescue victims, and bring abusers to justice. With this in mind, the ideal outcome goes as follows: NCMEC identifies specific pieces of information within the reported imagery to help verify the depicted child’s identity and location, then passes that information to law enforcement to put an end to the child’s victimization and remove them from harm.


However, while NCMEC’s dedicated analysts do their absolute best to manage the overwhelming amount of CSAM reported each day, manual review of abusive content proved to be a less-than-ideal way of addressing ongoing abuse. That’s when NCMEC reached out to the Videntifier team in need of a solution that would help them better aid law enforcement in rescuing children from dire exploitative circumstances.


The challenge


Confronting recycled child sexual abuse content


Child sexual abuse material (CSAM) is posted online every day and has, unfortunately, been persistently infiltrating various corners of the web for over two decades. Despite online platforms doing their best to remove the abusive content, it is often reuploaded elsewhere, causing a cycle of revictimization. As a result, among the millions of CSAM videos and images reported each month, a large share turns out to be duplicated content that already exists within NCMEC’s database, making it difficult to address new reports of exploitation.


Exploited Children Division Director Bettye Allwang remarks, “In my fifteen years of working here, I have seen the same videos this week as I did in 2006.”


And so, relative to the amount of reported material coming in, NCMEC’s small staff, spread across three regional offices and one headquarters, can only do so much; the organization has received over a million hours of video since the CyberTipline’s inception, a large chunk of it duplicate content.


The need for robust video identification: Finding needles in a haystack


“At the end of the day, our team is looking for nuggets of information that are going to help law enforcement identify child sexual abuse victims. It’s looking for a needle in a haystack. And time is critical when these children have ongoing abuse taking place.”

John Shehan, VP of the Exploited Children Division



That said, receiving so many previously reported videos was creating a heavier workload for little return, as reports with new material—i.e., new leads indicating cases of abuse and exploitation—weren’t being given the priority they deserved.


What’s more, on top of the frequency of duplicated videos, these duplicates are often modified or altered—whether it be through editing, cropping, filtering, downscaling, rotation, etc.—to avoid detection, thus making it that much more difficult for NCMEC’s reviewers to separate known CSAM from unknown.


With the ever-persistent influx of reports, NCMEC realized it was time to invest in automation that would give reviewers the upper hand in the fight against CSAM, as well as relieve some of the strain and trauma caused by being subjected to the same disturbing content again and again.


The solution


Videntifier enhances NCMEC’s ability to stop ongoing abuse


NCMEC reached out to Videntifier CEO Sigurdur Ragnarsson to help strengthen their victim identification process. After a short conversation, we quickly understood the sensitivity and urgency of NCMEC’s dilemma. As soon as we possibly could, we got to work implementing a video identification tool that would help NCMEC build a system to streamline the detection of known CSAM reported to the CyberTipline and dramatically improve NCMEC’s ability to address new reports in a timely manner.


“NCMEC received over 44 million videos of suspected child sexual exploitation in 2021. With our technology, it is possible to find content in our database in less than 1 second, dramatically cutting the staff’s time in identifying potential child victims.​”

Sigurdur Ragnarsson, CEO at Videntifier Technologies


Here’s what our video identification tool delivered:


Faster detection


Identifying a video within NCMEC’s collection of CSAM reports now takes only a few seconds—as compared to painstaking hours of manual review—and the media is automatically categorized based on its manual review status. Additionally, our software interoperates with the CyberTipline database, automatically flagging a video if it matches previously reported content, regardless of whether that content has been altered.


Thwart modified CSAM attempting to escape detection


Our technology is unique in comparison to other video identification tools, as it is able to ‘see through’ common modifications of duplicated material, such as cropping, filtering, downscaling, and rotation. As mentioned above, these types of modifications make identifying content almost impossible when fed through more primitive versions of video identification tech. When it comes to stopping the spread of CSAM, it is important to us that our technology is able to thwart abusers’ efforts to continue the distribution of harmful content by altering it to avoid detection.


Improve NCMEC’s ability to help law enforcement rescue children


With an automated process, the analysts can now focus on reports of child sexual exploitation in which the victims have yet to be identified. Before, analysts assessed new reports of previously reviewed media only to later find out that the information had already been passed on to law enforcement, resulting in wasted time and resources while doing close to nothing to help victims of ongoing abuse. Since our video identification tool helps reviewers quickly distinguish between known and unknown material, NCMEC now sends information to law enforcement faster than ever before, hastening the rescue of children from current abuse.


Reduce reviewers’ exposure to traumatizing imagery


NCMEC does not have an extensive staff, so sifting through millions of videos is a daunting, near-impossible task, especially given the disturbing nature of the content.

With Videntifier, NCMEC no longer requires manual review of known media, reducing the need to expose their victim identification team to the same traumatizing imagery again and again.


Breaking down the tech: How NCMEC uses Videntifier video identification


Utilize advanced hash technology: Hashing is a method of mapping a piece of data to a fixed value. When a video or image is hashed, duplicates of that file produce the same value, making duplicated content easy to match. While simpler, less robust hashing can only identify exact duplicates, our identification tool is designed to match content even when it has been altered, giving NCMEC an enormous advantage in its ability to match reports with known CSAM and isolate new leads.
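
To make the distinction concrete, here is a minimal sketch (not NCMEC’s or Videntifier’s actual code) contrasting an exact cryptographic hash, which changes completely after any edit, with a toy perceptual-style signature that tolerates small changes:

```python
# A minimal sketch contrasting exact hashing with a toy perceptual-style signature.
import hashlib

original = bytes(range(256)) * 100           # stand-in for an image/video file
modified = original[:-1] + b"\x00"           # a tiny change, e.g. from re-encoding

# Exact hashing: any change yields a completely different value, so a cropped
# or re-encoded duplicate never matches the stored hash.
print(hashlib.sha256(original).hexdigest() == hashlib.sha256(modified).hexdigest())  # False

# Toy perceptual-style signature: summarize coarse structure (block averages),
# so small changes leave most of the signature intact.
def coarse_signature(data: bytes, blocks: int = 16) -> list[int]:
    step = max(1, len(data) // blocks)
    return [sum(data[i:i + step]) // step for i in range(0, step * blocks, step)]

distance = sum(abs(a - b) for a, b in
               zip(coarse_signature(original), coarse_signature(modified)))
print(distance)  # near zero: the two files are recognized as the same content
```

Production systems use far more sophisticated visual fingerprints, but the principle is the same: the signature should survive the kinds of edits used to dodge detection.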


Assess video or image for a distinct visual fingerprint: When NCMEC receives reported abusive imagery, our identification tool extracts the video’s distinct visual fingerprint and finds its match in the NCMEC database—all while simultaneously scanning for visually similar media. Needing only a single frame of a video or image, our technology can accurately search the given piece of media against NCMEC’s database and match it to material that has already come through the CyberTipline, drastically reducing the time spent manually reviewing reports.
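
For illustration only, here is a hedged sketch of frame-level fingerprint matching that uses OpenCV’s ORB local-feature descriptors as a stand-in for Videntifier’s proprietary fingerprints; the media ID, thresholds, and in-memory ‘database’ are hypothetical:

```python
# Illustrative frame-level fingerprint matching with ORB descriptors.
# A random image stands in for a real video frame.
import cv2
import numpy as np

rng = np.random.default_rng(0)
known_frame = rng.integers(0, 256, size=(240, 320), dtype=np.uint8)  # stand-in frame

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# "Ingest": fingerprint a known frame and store its descriptors under a media ID.
_, known_descriptors = orb.detectAndCompute(known_frame, None)
database = {"video_0001_frame_12": known_descriptors}   # hypothetical ID

# "Query": fingerprint a cropped copy of the same frame, then search the database.
query_frame = known_frame[30:210, 40:280]
_, query_descriptors = orb.detectAndCompute(query_frame, None)

best_id, best_score = None, 0
for media_id, descriptors in database.items():
    if query_descriptors is None or descriptors is None:
        continue
    matches = matcher.match(query_descriptors, descriptors)
    good = [m for m in matches if m.distance < 40]       # Hamming-distance cutoff
    if len(good) > best_score:
        best_id, best_score = media_id, len(good)

# Expected to print the matching ID when enough descriptors survive the crop.
print(best_id if best_score >= 10 else "no match")
```

Because each descriptor summarizes a small local patch of the frame, a cropped or otherwise partially altered copy still shares enough descriptors with the original to be matched.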


Save, categorize, label, and report to law enforcement: After a specific piece of CSAM is searched against the database and identified, the information is saved in the system’s query history for future reference and categorized as either known or unknown media. With the media easily accessible and labeled accordingly, NCMEC staff gather information from the newest reported material to send to law enforcement, while recycled CSAM is handled according to its existing status.
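
A simplified sketch of that triage step might look like the following; the class, field, and status names are illustrative, not NCMEC’s actual schema:

```python
# Every search is logged to a query history, and the result decides whether
# a report is routed as known or new material.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class QueryRecord:
    media_id: str
    matched_id: Optional[str]
    status: str                       # "known" or "unknown"
    queried_at: datetime

@dataclass
class TriageLog:
    history: list = field(default_factory=list)

    def triage(self, media_id: str, matched_id: Optional[str]) -> str:
        status = "known" if matched_id else "unknown"
        self.history.append(QueryRecord(media_id, matched_id, status,
                                        datetime.now(timezone.utc)))
        return status

log = TriageLog()
print(log.triage("report_1001", "video_0001_frame_12"))  # "known": handled per existing status
print(log.triage("report_1002", None))                   # "unknown": a new lead to escalate
```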


Notify online service provider of harbored abusive content: Finally, NCMEC alerts the online service provider that, usually unknowingly and unintentionally, is harboring the revictimizing post, signaling them to take the appropriate actions to ensure the abusive content is taken down and stays down.




Illustrating NCMEC’s improvement


Before partnering with Videntifier, NCMEC relied on strict hash matching, a more primitive approach to video and image identification that can only identify exact duplicates; this reduced the total to 12.9 million unique videos. That was a big reduction, but still too much content to put through manual review. After partnering with Videntifier, NCMEC was able to identify a wide breadth of altered content, and therefore more duplicated videos, reducing the count of unique videos to 5.1 million.

Still a large number, but a significant reduction.



Without video identification tech:
44.8 million CSAM videos (unique and duplicated) reported to NCMEC in 2021

↓

Using primitive video identification tech:
Reduced to 12.9 million unique videos

↓

With Videntifier video identification:
Reduced to 5.1 million unique videos
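
A quick back-of-the-envelope calculation on the figures above puts the improvement in perspective:

```python
# Back-of-the-envelope arithmetic on the figures quoted above (2021 data).
reported = 44_800_000            # videos reported to NCMEC, duplicates included
exact_hash_unique = 12_900_000   # left after strict (exact) hash matching
videntifier_unique = 5_100_000   # left after robust visual matching

print(f"Exact hashing removed {1 - exact_hash_unique / reported:.0%} of the volume")
print(f"Robust matching removed {1 - videntifier_unique / reported:.0%} of the volume")
print(f"Robust matching cut the remaining review queue by a further "
      f"{1 - videntifier_unique / exact_hash_unique:.0%}")
```

In rough terms, exact hashing removed about 71% of the reported volume, while robust matching removed about 89%, cutting the remaining review queue by a further 60% or so.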




The result


Providing NCMEC with a capable and effective video identification solution


Still working together on a regular basis, the Videntifier and NCMEC teams continue to optimize the hotline’s video identification capabilities to meet their unique needs. Despite being a small company, we can proudly—and with much gratitude—report that we’ve made an immense impact on NCMEC’s efficiency. Our highly accurate video identification tool supports and streamlines NCMEC’s mission to protect children worldwide.


“[Videntifier] was the most technically capable solution that we looked at for video comparison . . . and has, as a company, been a wonderful NCMEC partner, fully engaged, an almost integral part of our technology team.”

Derek Bezy, VP of NCMEC’s Technology Division


To sum up, here are a few of the project’s specific results:

  • Streamlined and automated NCMEC’s video identification.

  • Hastened both the rescue of exploited children from ongoing abuse and the incarceration of perpetrators.

  • Reduced NCMEC staff’s exposure to traumatizing imagery.

  • Utilized advanced video identification technology to protect children.

  • Decreased manual review of known CSAM.

  • Increased time and resources to focus on new CSAM reports.


Moving forward: Further optimization of NCMEC’s system and partnering with Point de Contact

Having enhanced their ability to identify CSAM videos, we’re currently working with the NCMEC team to implement the same system for their image collection, which will help the nonprofit perform sharper analysis across all reported imagery and detect ongoing abuse across a broader scope of media.


NCMEC has also partnered with Point de Contact to share hashes of identified CSAM. Like NCMEC, Point de Contact is a hotline that allows users to report potentially illicit content they come across while browsing the web. Powered by Videntifier technology, the combined efforts and expertise of our three organizations are building a fast and effective ecosystem for stopping the spread of abusive content and rescuing its victims from harm.


A light in the darkest corners of the web


“For us, this is the most important work that we do.”

Sigurdur Ragnarsson, CEO at Videntifier Technologies


While the persistent influx of CSAM being uploaded online may seem nearly unstoppable, we believe that the work being done by organizations like NCMEC and Point de Contact is the most important thing for technology companies like Videntifier to focus on. Though we have already made great strides in curbing the distribution of CSAM and stopping the harm that comes to these innocent children, we’re continually working to make the Videntifier system stronger, smarter, and more sophisticated in its ability to quell these horrific crimes. Through our continued work with NCMEC, the Videntifier team is firm in its dedication to becoming an even brighter light for children exploited in the darkest corners of the web.























