Google’s latest attempt to battle the spread of child sexual abuse material (CSAM) online comes in the form of an AI that can quickly identify images that haven’t been previously catalogued.
It’s part of the company’s Content Safety API, which is available to NGOs and other bodies working on this issue. By automating the process of sifting through images, the AI not only speeds things up but also reduces the number of human reviewers who must be exposed to the material – a job that can take a serious psychological toll.
Given that the UK’s Internet Watch Foundation (IWF) found nearly 80,000 sites hosting CSAM last year, it’s clear that the problem isn’t close to being contained. Google’s been on this mission for several years now, and it’s certainly not the only tech firm that’s taking steps to curb the spread of CSAM on the web.
Google began removing search results related to such content back in 2013, and subsequently partnered with Facebook, Twitter and the IWF to share lists of file hashes that help identify and remove known CSAM files from their platforms.
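The hash-sharing approach above can be illustrated with a minimal sketch. This is not Google’s actual system – industry tools typically use perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding – but a simplified version using cryptographic file hashes shows the basic idea: each platform hashes uploaded files and checks the result against a shared blocklist. The function names and the blocklist set here are hypothetical.

```python
import hashlib

def file_sha256(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known(path: str, known_hashes: set[str]) -> bool:
    """True if the file's hash appears in a shared blocklist."""
    return file_sha256(path) in known_hashes
```

Note the limitation this sketch makes visible: a cryptographic hash only matches byte-identical files, which is exactly why previously uncatalogued images slip through hash lists and why Google’s new classifier, which analyzes image content directly, is a meaningful addition.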
Find out more about the new service, which is available for free to NGOs, on this page.