Microsoft partners with StopNCII to tackle intimate imagery abuse


Service will improve the detection of inappropriate imagery on Bing, including images generated by AI

Amber Hickman


Microsoft has partnered with international charity StopNCII to help tackle non-consensual intimate imagery (NCII) on Bing. 

The StopNCII platform, run by the South West Grid for Learning, allows adults to prevent intimate images, including deepfakes generated by artificial intelligence, from being shared online without their consent. Users create digital fingerprints of their images on the platform, and Microsoft can then use these fingerprints to detect matching imagery on Bing.
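To illustrate the general idea of fingerprint-based detection, the sketch below shows hash matching in simplified form. It is not StopNCII's or Microsoft's actual implementation: the hash function, function names and matching logic are illustrative stand-ins, and real systems use perceptual hashes that tolerate resizing and re-encoding rather than the exact-match cryptographic hash used here.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Return a digital fingerprint (hash) of an image.

    Stand-in only: a cryptographic hash matches exact copies, whereas
    production systems use perceptual hashing that survives edits.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def build_blocklist(reported_images: list[bytes]) -> set[str]:
    """Hash each image a person reports; only the hashes are stored or shared."""
    return {fingerprint(img) for img in reported_images}


def should_block(candidate: bytes, blocklist: set[str]) -> bool:
    """Check a candidate image against the shared hash list."""
    return fingerprint(candidate) in blocklist


# Hypothetical usage: the person's images stay on their own device;
# only the fingerprints are shared with participating platforms.
if __name__ == "__main__":
    private_images = [b"example-image-bytes"]
    hashes = build_blocklist(private_images)
    print(should_block(b"example-image-bytes", hashes))    # True
    print(should_block(b"different-image-bytes", hashes))  # False
```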

Microsoft began piloting the StopNCII database in March 2024 and has since acted on more than 268,000 images.

Microsoft will also continue working with its multistakeholder working group — led by the Centre for Democracy & Technology, Cyber Civil Rights Initiative and the National Network to End Domestic Violence — as well as the US Department of Commerce’s National Institute of Standards & Technology and the AI Safety Institute to continue efforts to protect its users. 

“At Microsoft, we recognise that we have a responsibility to protect our users from illegal and harmful online content while respecting fundamental rights,” said Courtney Gregoire, chief digital safety officer at Microsoft, in a blog post. “We strive to achieve this across our diverse services by taking a risk proportionate approach, tailoring our safety measures to the risk and to the unique service.”
