Alex Smith
The use of misinformation to cause chaos and disruption is far from a new phenomenon – false and misleading stories have been weaponised to create doubt and manipulate public opinion since ancient times. The digital world, however, has unlocked new and faster ways of spreading fake stories using tools that can alter or fabricate media.
Misinformation has therefore become more dangerous than ever, with faked media being used for everything from financial scams and revenge porn to political propaganda and conspiracy theories. The impact upon trust in media has been stark. According to the Reuters Institute’s Digital News Report 2024, trust in the news is at just 40 per cent across the 47 countries surveyed, with 59 per cent of respondents saying they are concerned about fake news online.
Rebuilding that trust will be no easy task and will require the efforts of organisations across many different sectors, says Andrew Jenks, director of media provenance at Microsoft.
“All organisations, not just media companies, must create more transparency to help their employees, customers and broader audiences quickly navigate and feel more confident about the information they’re consuming,” says Jenks. “Some of this is achieved through technology like tools and classifiers, but organisations should think through education opportunities as well. People do not consume media the same way they did 20 years ago or even five years ago. This shift warrants new ways of thinking about how organisations can help people keep informed and navigate this transformation.”
Microsoft has been working to develop the necessary solutions for certifying the provenance – the origin, authenticity and history – of online media since 2019. Efforts by researchers and engineers in the Microsoft Research team led to the development of the Authentication of Media via Provenance (AMP) system, in which a manifest for each piece of media is uploaded by a content provider to a central database. These manifests can then be quickly searched by applications such as browsers to help verify the integrity of that piece of media in the future.
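The core idea behind AMP can be illustrated with a short sketch: a provider registers a manifest keyed by a cryptographic hash of the media, and a client later recomputes the hash to look the manifest up. This is a simplified illustration, not Microsoft's actual API; the function names and manifest structure here are hypothetical.

```python
import hashlib

# Stand-in for the central manifest database described in AMP.
manifest_db = {}

def register_media(media_bytes, source, history):
    """Provider side: publish a manifest keyed by the media's SHA-256 digest."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    manifest_db[digest] = {"source": source, "history": history}
    return digest

def verify_media(media_bytes):
    """Client side (e.g. a browser): recompute the digest and look it up.

    Any alteration to the bytes changes the digest, so tampered or unknown
    media simply has no matching provenance record.
    """
    digest = hashlib.sha256(media_bytes).hexdigest()
    return manifest_db.get(digest)  # None means no provenance record found

photo = b"...image bytes..."
register_media(photo, source="BBC", history=["captured", "cropped"])
assert verify_media(photo) is not None          # original verifies
assert verify_media(photo + b"x") is None       # altered copy does not
```

The point of the design is that verification needs only a fast hash lookup, so applications such as browsers can check media without contacting the original publisher.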
Building upon the AMP system, Microsoft then worked with Adobe, the BBC, Truepic and other companies to found the Coalition for Content Provenance and Authenticity (C2PA) in February 2021. Collaborating as part of C2PA, the partners developed Content Credentials, an industry-standard system that follows AMP's approach of creating a detailed manifest logging any actions taken to edit or convert the media, along with a cryptographic signature to make tampering detectable.
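The tamper-evidence property described above can be sketched in a few lines: a manifest records each editing action, and a signature computed over the manifest means any later, undisclosed change is detectable. Note this is an illustrative toy using a shared-key HMAC; the real Content Credentials specification uses certificate-based signatures, and the names here are the author's own.

```python
import hashlib
import hmac
import json

# Stand-in for a signer's private credential (real Content Credentials
# use certificate-based signatures, not a shared symmetric key).
SIGNING_KEY = b"demo-key"

def sign_manifest(actions):
    """Produce a manifest listing edit actions, plus a signature over it."""
    payload = json.dumps({"actions": actions}, sort_keys=True).encode()
    sig = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"actions": actions, "signature": sig}

def is_untampered(manifest):
    """Recompute the signature; any edit to the manifest breaks the match."""
    payload = json.dumps({"actions": manifest["actions"]}, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])

m = sign_manifest(["captured", "resized", "colour-corrected"])
assert is_untampered(m)                # disclosed edits verify
m["actions"].append("face swapped")    # an undisclosed alteration...
assert not is_untampered(m)            # ...invalidates the signature
```

The design choice worth noting is that the manifest does not prevent editing; it makes the edit history verifiable, so a consumer can distinguish disclosed changes from silent manipulation.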
“C2PA serves as a standards body and is responsible for creating the interoperable, open technical specification for Content Credentials, guidance in its usage, and recommendations for implementation which enables organisations to use provenance technology,” says Jenks. “The organisation has grown from a handful of members in 2021 to over 250 today, including Amazon, Google, LinkedIn, Meta, OpenAI, Samsung, and many more. Microsoft is a steering committee member and continues to be active across the organisation.”
Content Credentials is an industry-standard provenance system developed by Adobe and other partners in the Coalition for Content Provenance and Authenticity
Microsoft also helped to found Project Origin, an alliance of organisations from the publishing and technology sectors working together to create a process by which the provenance of news content can be confirmed. Now an International Press Telecommunications Council project, the alliance aims to maintain confidence in news from verified providers.
“Microsoft is proud to continue to support its work with media organisations and journalists worldwide,” says Jenks.
Collaboration between technology and media partners will also be required to find solutions for new sources of misinformation, including the growing use of AI tools for media manipulation. Recent developments in AI have had a transformational impact on media production and distribution, as well as enabling significant breakthroughs in healthcare, education, manufacturing and other industries. But in the wrong hands, AI could also prove to be a powerful weapon that can be used to create fake stories, images or videos.
As a leader in AI development, Microsoft has a significant role to play in ensuring the technology is used safely. It has engaged with non-profit and civil society organisations to support broader industry standards for synthetic media. One example is its participation in work led by the Partnership on AI to establish a framework for responsible practices, including a case study on LinkedIn's Content Credentials implementation.
“As history has taught us, we need to proactively address risks,” says Jenks. “Our strategy to protect people and communities from harmful AI content is based on six focus areas: a strong safety architecture; durable media provenance and watermarking; safeguarding our services from abusive content and conduct; robust collaboration across industry and with governments and civil society; modernised legislation to protect people from the abuse of technology; and public awareness and education.”
According to Jenks, combating the proliferation of fake media will ultimately require a multilayered approach similar to that used for security and privacy, with both technology and education having a role to play.
“Content Credentials is a powerful tool that is even stronger when combined with other technologies, such as watermarking and fingerprinting, to create a durable Content Credential,” he explains. “Beyond technology, we believe that public education is imperative due to fundamental changes in information sharing and media consumption. From children to senior citizens, we need to ensure that everyone has strategies to help identify the origin of the content and think critically about the content they consume.”
Discover more insights like this in the Spring 2025 issue of Technology Record. Don’t miss out – subscribe for free today and get future issues delivered straight to your inbox.