Image Analyzer wins UK Government grant to develop CSAM-detection technology for end-to-end encrypted services
Visual content moderation software company Image Analyzer has been selected to receive a share of the UK Government’s Safety Tech Challenge Fund to find new ways to detect Child Sexual Abuse Material (CSAM) sent via encrypted channels without compromising citizens’ privacy. Image Analyzer will work in partnership with content encryption technology provider Galaxkey and digital identity and age verification technology company Yoti to develop AI-powered visual content analysis technology that works within messaging services that employ end-to-end encryption.
Cris Pikes, CEO and founder of Image Analyzer, commented, “Image Analyzer is delighted to be collaborating with Galaxkey and Yoti to deliver this exciting, first-of-a-kind technology pilot that recognises the importance of protecting users’ data and privacy whilst addressing the inherent risks to children associated with end-to-end encryption. As a ground-breaking technology collaboration, the Galaxkey, Yoti and Image Analyzer solution will enable users to access all of the benefits related to encryption whilst enabling clean data streams and offering reassurance within specific use case scenarios such as educational sharing.”
End-to-end encryption (E2EE) is already built into the WhatsApp and Signal apps. In April, the Home Secretary and the NSPCC decried Facebook’s plans to implement E2EE within Instagram and Messenger, citing the increased risks to children if law enforcement agencies cannot compile evidence of illegal images, videos and messages sent by child abusers. The government and child safety organisations warned that E2EE drastically reduces technology companies’ ability to detect and prevent the proliferation of CSAM on their platforms, and prevents law enforcement agencies from arresting offenders and safeguarding victims. The NSPCC has estimated that up to 70% of digital evidence will be hidden if Facebook goes ahead with plans to introduce E2EE in Instagram and Messenger.
Child protection experts warn that encrypted messages could conceal evidence of grooming and coercion of children, as well as the sharing of indecent and illegal images and extremist material.
The NSPCC also observed that encryption could create a technical loophole for technology companies to avoid their duty of care once the Online Safety Bill becomes law, allowing them to ‘engineer away’ their responsibility to monitor and remove harmful content.
In response to these heightened risks, the Safety Tech Challenge Fund was announced by the UK Government in September. The UK Government has awarded five organisations up to £85,000 each to prototype and evaluate new ways of detecting and addressing CSAM shared within E2EE environments, such as online messaging platforms, without compromising the privacy of legitimate users. The Safety Tech Challenge Fund provides a mechanism for government, the technology industry, non-profit organisations and academics to discover solutions and share best practice. Fund recipients have until March 2022 to deliver their proofs of concept.
Image Analyzer holds European and U.S. patents for its automated, artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users. Its technology helps organisations minimise their corporate legal risk exposure caused by people abusing their digital platform access to share harmful visual material. Image Analyzer’s technology is designed to identify visual risks in milliseconds, including illegal content and images and videos deemed harmful to users, particularly children and vulnerable adults, with near-zero false positives.
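Image Analyzer’s models and interfaces are proprietary and are not described in this announcement, but the general shape of an automated visual-risk check can be sketched. The snippet below is a hypothetical illustration only: it scores a single image with an off-the-shelf classifier and flags it when the score crosses an assumed threshold. The ResNet backbone, the `risk_score`/`should_flag` helpers and the `RISK_THRESHOLD` value are all stand-ins invented for illustration, not Image Analyzer’s technology.

```python
# Hypothetical sketch of an automated visual-risk check (not Image Analyzer's API).
# Scores an image with a classifier and flags it when the score crosses a threshold.
import torch
from PIL import Image
from torchvision import models, transforms

RISK_THRESHOLD = 0.85  # assumed policy threshold, chosen for illustration

# Stand-in backbone; a real moderation system would use a model trained on
# policy-specific risk categories rather than a generic ImageNet classifier.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def risk_score(image_path: str) -> float:
    """Return a placeholder 'risk' score in [0, 1] for one image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)
    with torch.no_grad():
        logits = model(batch)
    # Placeholder: treat the top class probability as the risk score.
    return torch.softmax(logits, dim=1).max().item()

def should_flag(image_path: str) -> bool:
    """Flag the image for review if its score exceeds the threshold."""
    return risk_score(image_path) >= RISK_THRESHOLD

if __name__ == "__main__":
    print(should_flag("example.jpg"))  # hypothetical input file
```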
Pikes continues, “Law enforcement agencies depend on evidence to bring child abuse cases to court. There is a delicate balance between protecting privacy in communications and drawing a technical veil over illegal activity which jeopardises children. Hidden doesn’t mean it’s not happening. The UK Online Safety Bill will bring messaging apps into scope for the swift removal of harmful text and images. However, where content is encrypted end-to-end, this will significantly reduce Ofcom’s ability to prevent the most serious online harms. Working in partnership with encryption specialists at Galaxkey and identity verification experts at Yoti, we’ll help organisations to address online harms, while protecting the privacy of their law-abiding users.”
For further information please visit: https://www.image-analyzer.com