By BARBARA ORTUTAY and MATT O’BRIEN (AP Technology Writers)
A tipline created 26 years ago to fight online child abuse has not reached its full potential and needs technological upgrades to help law enforcement catch abusers and save victims, according to a new report from the Stanford Internet Observatory.
The researchers emphasize the urgency of improving what they call an “extremely valuable” service, especially as new artificial intelligence technology threatens to exacerbate its problems.
“In the years ahead, the CyberTipline will likely be inundated with highly realistic AI content, making it even more challenging for law enforcement to identify real children in need of rescue,” explained Shelby Grossman, one of the report's authors.
The CyberTipline was established by Congress as the primary defense for children exploited online. By law, tech companies must report any child sexual abuse material they find on their platforms to the system operated by the National Center for Missing and Exploited Children. After receiving the reports, NCMEC tries to locate the individuals who sent or received the material, as well as the victims if possible. These reports are then passed on to law enforcement.
Despite the overwhelming number of CyberTipline reports, researchers assert that volume is only one of several fundamental issues with the system. For example, many reports sent by tech companies such as Google, Amazon, and Meta lack crucial details, such as adequate information about an offender’s identity. This complicates law enforcement's ability to determine which reports should be prioritized.
“The entire system currently has significant problems, which will only worsen in a world where AI generates entirely new CSAM,” said Stanford lecturer and cybersecurity expert Alex Stamos, referring to child sexual abuse materials.
The system lags behind technologically and is plagued by an ongoing challenge faced by government and nonprofit tech platforms: the scarcity of highly skilled engineers, who are often offered much higher salaries in the tech industry. In some cases, these employees are even recruited by the same companies that submit the reports.
Additionally, there are legal limitations. According to the report, court rulings have led NCMEC staff to stop screening some files (for example, if they are not publicly available) before passing them to law enforcement. Many law enforcement officials believe they require a search warrant to access such images, leading to delays. Sometimes, they need multiple warrants or subpoenas to identify the same offender.
The system is also susceptible to distractions. The report discloses that NCMEC recently received a million reports in a single day due to a meme circulating on various platforms — some found it humorous while others shared it out of indignation.
“That day prompted them to make some changes,” said Stamos. “It took them weeks to address the backlog” by making it easier to group those images together.
The CyberTipline received more than 36 million reports in 2023, nearly all from online platforms. Facebook, Instagram and Google submitted the most, and the total has climbed steadily year over year.
Almost half of the tips sent last year were actionable, meaning NCMEC and law enforcement could follow up.
Hundreds of reports concerned the same offender, and many included multiple images or videos. About 92% of the reports filed in 2023 involved countries outside the U.S., a significant shift from 2008, when most involved victims or offenders inside the U.S.
Some are false alarms. “It frustrates law enforcement when they get these reports that they think are definitely adults,” Grossman told reporters. “But the system encourages platforms to be very cautious or to report potentially questionable content, because if it’s found to have been CSAM and they knew about it and didn’t report it, they could get fines.”
One relatively simple solution suggested in the report would improve how tech platforms label what they are reporting to distinguish between widely shared memes and something that deserves closer investigation.
The Stanford researchers interviewed 66 people involved with the CyberTipline, including law enforcement officials, NCMEC staff, and online platform employees.
The NCMEC said it looked forward to “exploring the recommendations internally and with key stakeholders.”
“Over the years, the complexity of reports and the seriousness of the crimes against children continue to evolve. Therefore, using new technological solutions in the entire CyberTipline process leads to more children being protected and offenders being held accountable,” it said in a statement.
Among the report’s other findings:
— The CyberTipline reporting form doesn’t have a dedicated field for submitting chat-related material, such as sextortion messaging. The FBI recently warned of a “huge increase” in sextortion cases targeting children — including financial sextortion, where someone threatens to release compromising images unless the victim pays.
— Police detectives told Stanford researchers they are having a hard time persuading their higher-ups to prioritize these crimes even after they present them with detailed written descriptions to emphasize their gravity. “They wince when they read it and they don’t really want to think about this,” Grossman said.
— Many law enforcement officials said they were not able to fully investigate all reports due to time and resource constraints. A single detective may be responsible for 2,000 reports a year.
— Outside the U.S., especially in poorer countries, the challenges around child exploitation reports are especially severe. Law enforcement agencies might not have reliable internet connections, “decent computers” or even gas for cars to execute search warrants.
— Pending legislation passed by the U.S. Senate in December would require online platforms to report child sex trafficking and online enticement to the CyberTipline and give law enforcement more time to investigate child sexual exploitation. Currently, the tipline doesn’t offer straightforward ways to report suspected sex trafficking.
While some supporters have suggested more invasive surveillance laws to catch abusers, Stamos, the former chief security officer at Facebook and Yahoo, stated that they should attempt simpler solutions first.
“There’s no reason to infringe on the privacy of users if you want to imprison more pedophiles. They’re right there,” Stamos stated. “The system is not very effective at using the existing information to pursue legal action.”