A tip line set up 26 years ago to combat online child exploitation has failed to live up to its potential and needs technological and other improvements to help law enforcement go after abusers and rescue victims, according to a new report from the Stanford Internet Observatory.
Solutions are also urgently needed for what the researchers describe as an “immensely valuable” service, as new artificial intelligence technology threatens to worsen its problems.
“Almost certainly, CyberTipline will be flooded with highly realistic-looking AI content in the coming years, which will make it even more difficult for law enforcement to identify real children in need of rescue,” said researcher Shelby Grossman, an author of the report.
The tip line was created by Congress as the main line of defense for children being exploited online. Under the law, tech companies must report any child sexual abuse material found on their platforms to the system, which is run by the National Center for Missing and Exploited Children. After receiving the reports, NCMEC works to find the people who sent or received the material – and, if possible, the victims – before forwarding the reports to police.
While the sheer volume of CyberTipline reports is overwhelming law enforcement, researchers say volume is just one of many problems at the heart of the system. For example, many of the reports sent out by tech companies – such as Google, Amazon and Meta – lack important details, such as enough information to identify the offender, the report said. This makes it difficult for law enforcement to know which reports to prioritize.
“There are significant problems with the entire system right now and those cracks will become fissures in a world where AI is generating brand new CSAM,” said Alex Stamos, a Stanford lecturer and cybersecurity expert, using the initials for child sexual abuse material.
The system is technologically behind and plagued by an ongoing challenge common to government and nonprofit technology platforms: a lack of highly skilled engineers, who can command much higher salaries in the tech industry. Sometimes those employees are even poached by the same companies that submit the reports.
Then there are the legal restrictions. According to the report, court decisions have caused NCMEC staff to stop vetting some files (for example, if they are not publicly available) before sending them to police. Many law enforcement officials believe that they need a search warrant to access such images, delaying the process. Sometimes multiple warrants or subpoenas are needed to identify the same perpetrator.
The system can also be easily sidetracked. The report shows that NCMEC recently reached a milestone of one million reports in a single day because of a meme that spread across multiple platforms – which some people shared because they thought it was funny and others shared out of outrage.
“That day led them to make some changes,” Stamos said. “It took them weeks to clear that backlog” by finding ways to group those identical images together.
The CyberTipline received more than 36 million reports in 2023, almost all of them from online platforms. Facebook, Instagram and Google were the companies that submitted the highest number of reports. The total number of reports has increased dramatically.
Nearly half of the tips sent last year were actionable, meaning NCMEC and law enforcement agencies were able to take action. The rest did not have enough information or the image had been reported many times before.
Hundreds of reports involved the same perpetrator, and many reports contained multiple images or videos. About 92% of reports filed in 2023 involved countries outside the US, a major shift from 2008, when the majority involved victims or perpetrators within the US.
Some are false alarms. “It drives law enforcement crazy when they get these reports that they think are absolutely adults,” Grossman told reporters. “But the system encourages platforms to be very conservative or to report potentially borderline content, because if it turns out to be CSAM and they are aware of it and don’t report it, they could face fines.”
A relatively simple fix suggested in the report would improve the way tech platforms label what they report, to distinguish between widely shared memes and something that deserves further investigation.
The Stanford researchers interviewed 66 people involved with the CyberTipline, ranging from law enforcement officers to NCMEC staff to online platform workers. Many say they have been voicing their concerns for years.
The NCMEC said it looks forward to “exploring the recommendations internally and with key stakeholders.”
“Over the years, the complexity of the reports and the severity of crimes against children continue to evolve. Therefore, leveraging emerging technology solutions throughout the CyberTipline process will help protect more children and hold offenders accountable,” the center said in a statement.
Other findings from the report:
– The CyberTipline reporting form does not have a special field for submitting chat-related material, such as sextortion messages. The FBI recently warned of a “massive increase” in sextortion cases targeting children – including financial sextortion, in which someone threatens to release compromising images unless the victim pays.
– Police detectives told Stanford researchers they are having a hard time convincing their senior leaders to prioritize these crimes. Even after writing detailed descriptions of the images to emphasize their seriousness, “often their higher-ups don’t even want to read their description,” Grossman said. “They shudder when they read it and don’t really want to think about it.”
– Many law enforcement officials said they were unable to fully investigate all reports due to time and resource constraints. One detective can be responsible for 2,000 reports per year.
– Outside the US, especially in poorer countries, the problems surrounding reports of child exploitation are particularly serious. Law enforcement agencies may not have reliable internet connections, “decent computers,” or even gasoline for cars to execute search warrants.
– Under pending legislation passed by the US Senate in December, online platforms would be required to report child sex trafficking and online enticement to the CyberTipline, and law enforcement would get more time to investigate child sexual exploitation. Currently, the tip line does not provide easy ways to report suspected sex trafficking.
While some advocates have proposed more intrusive surveillance laws to catch abusers, Stamos, the former chief security officer at Facebook, said simpler solutions should be tried first.
“There is no need to violate user privacy if you want to put more pedophiles in prison. They’re sitting there,” Stamos said. “The system doesn’t work very well at taking the information that currently exists and then turning it into prosecutions.”