Coronavirus has led to a “global slowdown” in the removal of internet child abuse images, say campaigners.
The Internet Watch Foundation says tech firms have fewer staff to delete illegal material, making it easier for sexual predators to view and share it.
Almost 90% fewer suspicious web addresses, or URLs, have been deleted during the pandemic, says the charity.
The warning comes as the IWF’s annual report reveals Europe is the “hub” for child sexual abuse photos and videos.
In 2019, 89% of URLs containing abuse material were found on computer servers based in Europe, compared with 79% in 2018.
Servers in the Netherlands, which has a strong technological infrastructure and low costs, hosted the most illegal content discovered by IWF staff – 93,962 URLs, or 71% of the total.
“We have seen a real and frightening jump in the amount of child sexual abuse material that is being hosted right on our doorstep here in Europe,” IWF chief executive Susie Hargreaves said.
Countries must adopt a “zero tolerance” approach to the problem by tackling both supply and demand, Ms Hargreaves added.
“While the UK doesn’t have this ‘hosting’ issue, our problem is that many consumers of child sexual abuse live here,” she pointed out.
She praised staff at the charity who last year removed 132,676 web pages and newsgroups showing child sexual abuse material, after assessing reports from people across the globe.
“It doesn’t matter how often the team sees this content, they never lose their humanity or fail to be shocked by the level of depravity and cruelty that some, a minority, engage in,” she said.
The immediate problem identified by the IWF is that social-distancing and self-isolation rules have cut the number of staff in technology companies, call centres and law enforcement able to flag and respond to reports of illegal content.
As a result, it is taking longer for child abuse images to be removed.
Between 16 March and 15 April, 1,498 URLs were deleted compared with 14,947 in the previous four weeks.
“Hotlines and abuse teams across the globe need to be aware there is a slowdown of this content being removed and to be mindful of doing what they can, within their ability, to get this content taken down,” the charity said.