Concerned parents have sent hundreds of stuffed animals to the home of Amazon CEO Jeff Bezos in protest against the e-commerce and cloud computing giant’s alleged failure to report child sexual abuse material (CSAM).
On 29 July, the day Bezos testified before Congress alongside other tech CEOs, members of ParentsTogether, an online community of more than two million US-based parents, also staged a demonstration with teddy bears outside Capitol Hill.
The issue of CSAM was not discussed during the hearing, however, which focused on the market power and dominance of major technology companies.
Amid a crisis of child sexual abuse and exploitation online, which has surged during the Covid-19 coronavirus pandemic, ParentsTogether has warned that Amazon is failing to report images of child sexual abuse to the appropriate authorities at the volume required to keep children safe online.
It said 69 million photos and videos of children being sexually abused were reported to the National Center for Missing and Exploited Children last year, but that just eight of those reports came from Amazon Web Services (AWS).
By comparison, it said other major technology companies, such as Google, Facebook and Apple, had each made hundreds of thousands, even millions, of reports.
“Amazon Web Services has vast resources, controls a third of cloud infrastructure services, and handles billions of uploads and downloads. Their abysmal failure to report CSAM makes it clear they are not looking for it,” said Amanda Kloer, campaign director at ParentsTogether.
“Jeff Bezos and Amazon turning a blind eye to predators sharing images of children being sexually abused is inexcusable. Amazon must do more to proactively find and report all known CSAM and make keeping kids safe online a priority, not an afterthought.”
When asked by Computer Weekly if there was a specific instance of AWS leaving abuse up that triggered the protests, ParentsTogether responded: “Sadly, it’s all too easy to find examples of CSAM on Amazon.
“Some recent examples of people using Amazon services to store and share child sexual abuse material: a Colorado man arrested last week with images of infants and toddlers on his Kindle; a UK teaching assistant who used Amazon vouchers to purchase CSAM; and several of Amazon’s Mechanical Turk workers, who have complained about being shown CSAM without warning.”
The group added: “Because they refuse to find and report child sexual abuse material, it’s impossible to know exactly how much they host. However, this is a problem that is endemic across the internet. Amazon’s peers have demonstrated that child sexual abuse material exists on all platforms, across all companies… It is inconceivable that Amazon is somehow immune to this problem.”
In a petition started by Kloer on the issue, which as of publication has reached 2,323 signatures, she said: “Amazon’s refusal to report CSAM is part of a growing trend of big tech companies pushing the full responsibility of keeping kids safe online onto parents, many of whom are now in the impossible position of working, parenting, teaching and policing technology all at the same time.
“Tech companies like Amazon are making billions from families, and they must share the responsibility to keep kids safe. By proactively searching for and reporting CSAM photos and videos, Amazon could save thousands of children from abuse and revictimisation.”
In response to questions from Computer Weekly about the allegations and what AWS is doing to address the protestors’ concerns, an Amazon spokesperson reiterated a statement given to The Telegraph in November 2019: “Our acceptable use policy clearly prohibits illegal content, and it is not tolerated. When notified of illegal content on our network, we respond quickly and take action to remove it.
“When we receive notifications of illegal content from the Internet Watch Foundation (IWF), the National Center for Missing and Exploited Children (NCMEC) in the US, or any of the other agencies we work with, we take immediate action with this type of harmful content.
“We also proactively fight against some of the worst types of crime. We work with organisations such as the IWF, law enforcement, Marinus Analytics, NCMEC, Thorn and many others. These organisations use our advanced machine learning technologies to scour the internet in order to fight child exploitation and trafficking.”
The IWF, founded in 1996, is largely funded by annual membership subscriptions from tech companies, with Facebook, Amazon, Microsoft and Google among its highest contributors, each paying more than £80,000 a year to the organisation.
In June 2020, Amazon also implemented a one-year moratorium on police use of its facial-recognition software after an international backlash over its ties to law enforcement following the police murder of George Floyd.
However, Amazon said at the time it would continue to allow some organisations to use its Rekognition software if they were working on finding missing children or preventing human trafficking – specifically naming Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics as organisations that this applied to.
Read more about online safety
- The UK government should act immediately to deal with a “pandemic of misinformation” and introduce a draft online harms bill as a matter of urgency to rebuild trust in democratic institutions, report warns.
- Technology companies come together to launch a UK industry association dedicated to tackling online safety, with support and backing from the government, campaigners and charities.
- Internet Watch Foundation’s work on tackling the spread of child sexual abuse imagery is being enhanced with advanced video tagging technology and artificial intelligence.