Digital Solution in Action: Building Media Trust with Content Authenticity and Provenance



In an era of fast-moving digital content and AI-generated media, the ability to prove what’s real is no longer optional – it’s essential.  

That’s the focus of a new white paper by RNW Media, created to support journalists, digital content creators and independent media organisations with practical tools and strategies for preserving authenticity in their work. 

To accompany this, a public-facing guide is available, offering content creators a starting point to explore verification tools and build transparency into their workflows.

RNW Media recently became a member of the Content Authenticity Initiative, an Adobe-led effort to combat misinformation by showing where digital content comes from, who created it and whether it’s been edited. That commitment – and the growing need for practical guidance – is what led RNW Media to publish a new white paper on the topic. From the start, the paper positions content authenticity and provenance not as a reaction to technology, but as an opportunity to strengthen trust, transparency and creators’ rights in a changing media landscape. 

“[The] resource is not just a guide,” says Ana Garza Ochoa, our Digital Media Lead. “It’s a call for cross-industry collaboration to build integrity into our digital content workflows while embracing the possibilities of AI in a responsible, ethical way.” 

What Do We Mean by Content Authenticity and Provenance? 

“Is this real or fake?” That’s the question some people might ask when navigating content online, says Natalie Wang, Lead of Global Outreach at Numbers Protocol and an artist whose work explores the intersection of AI and creative expression. Natalie recently led a Masterclass at RNW Media, where she introduced participants to techniques for evaluating synthetic media and preserving authorship across digital platforms. However, the issue of authenticity goes beyond detection.

“Content authenticity is about knowing who created a piece of media, when and where it was made and whether it’s being used responsibly,” she explains. “That’s what provenance helps with. It gives audiences confidence and helps creators protect their work.” 
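To make that concrete, here is a minimal sketch of what a provenance record can capture in practice: a cryptographic fingerprint of the file, the creator’s name and a creation timestamp, sealed with a keyed signature so later tampering is detectable. This is an illustrative example only, not the workflow of any specific tool mentioned in this article; the file path, creator name and signing key are placeholders.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone
from pathlib import Path

SECRET_KEY = b"replace-with-a-real-signing-key"  # placeholder, for illustration only


def build_provenance_record(media_path: str, creator: str) -> dict:
    """Create a minimal provenance record: who made it, when, and a content fingerprint."""
    content = Path(media_path).read_bytes()
    record = {
        "file": Path(media_path).name,
        "creator": creator,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    # Sign the record so any later change to the fields (or the file hash) is detectable.
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record


if __name__ == "__main__":
    print(json.dumps(build_provenance_record("photo.jpg", "Jane Reporter"), indent=2))
```

Standards such as C2PA go further by embedding signed manifests directly inside the media file, but even a simple sidecar record like this makes “who, when and what” auditable.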

Natalie brings a deeply personal lens to the issue. As an oil painter and digital creator from Taiwan, she’s experienced both the vulnerability of sharing artwork online and the societal harms of misinformation. “Taiwan has seen how AI-generated hoaxes or disinformation can undermine trust. That reality drives me to work on tools that support both truth and creativity.” 

[Image: Pillars of the content authenticity framework – Creation, Provenance, Authenticity, Trust]

Navigating AI with Caution and Confidence 

Rather than seeing AI as a threat, both Natalie and Ana frame it as a tool which needs thoughtful oversight and accessible education. 

“AI has made it easier than ever to generate lifelike content,” Natalie says. “But instead of panic, we need tools that help us keep context. When we can show the story behind an image or video, we build trust and protect creators.” 

Ana agrees: “We don’t see AI as inherently good or bad. But we do see the urgency to improve understanding, especially for media organisations working to serve the public interest. Our aim is to equip them to navigate AI in ways that promote human rights and civic engagement.” 

Real Tools, Real Impact 

For media teams navigating the rise of AI-generated content, the first step doesn’t have to be overwhelming. In fact, some of the most effective solutions are simple and already available. 

“Create a list of provenance tools. Revisit your editorial guidelines. Add AI content logs,” says Ana. “These first steps make a real difference. They show your audience that transparency matters.” 
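An AI content log can be as simple as an append-only file that records what was generated, with which tool, from which prompt and how it was used. The sketch below assumes a JSON Lines file named ai_content_log.jsonl and placeholder field names; adapt both to your own editorial guidelines.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("ai_content_log.jsonl")  # placeholder filename


def log_ai_content(tool: str, prompt: str, output_ref: str, usage: str) -> None:
    """Append one entry describing a piece of AI-generated or AI-assisted content."""
    entry = {
        "logged_at": datetime.now(timezone.utc).isoformat(),
        "tool": tool,              # the model or service used
        "prompt": prompt,          # what was asked for
        "output_ref": output_ref,  # where the output lives (file, URL, CMS id)
        "usage": usage,            # how it was used and whether it was labelled
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")


if __name__ == "__main__":
    log_ai_content(
        tool="image-generation model",
        prompt="illustration of a newsroom discussing verification tools",
        output_ref="assets/newsroom-illustration.png",
        usage="published as a labelled AI-generated illustration",
    )
```

Because each entry is a single line of JSON, the log stays easy to search, audit and share with editors or fact-checkers later.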

Natalie also highlights tools that are already making a difference for creators. She shares a story from her experience at Numbers Protocol. A photographer, Koen Van Damme, once struggled to prove authorship during a copyright dispute. After that experience, Koen started using Capture Dashboard – an app that allows creators to register their work on the blockchain. Now, every photo he takes is securely timestamped and traceable, transforming both his workflow and revenue model. 

Other tools offer easy ways to build authenticity from the moment content is created. CaptureCam, for example, lets users record video and image content with embedded metadata showing when and where it was captured. Meanwhile, the C2PA Guide for ChatGPT can help media makers track and label AI-generated content, offering practical tips for logging prompts, outputs and usage. This creates clarity on what’s machine-generated and how it’s been used.
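A quick way to see what capture metadata a file already carries (or has lost) is to read its embedded EXIF tags. The sketch below uses the Pillow library as one possible option; note that EXIF is easy to strip or alter, so treat this as a first-pass check rather than the kind of verifiable provenance that tools like CaptureCam or C2PA manifests aim to provide.

```python
# Requires: pip install Pillow
from PIL import ExifTags, Image


def print_capture_metadata(image_path: str) -> None:
    """Print the EXIF tags embedded in an image, if any."""
    with Image.open(image_path) as img:
        exif = img.getexif()
        if len(exif) == 0:
            print(f"{image_path}: no EXIF metadata found (possibly stripped).")
            return
        for tag_id, value in exif.items():
            tag_name = ExifTags.TAGS.get(tag_id, f"tag {tag_id}")
            print(f"{tag_name}: {value}")


if __name__ == "__main__":
    print_capture_metadata("photo.jpg")  # placeholder path
```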

A Guide for the Sector 

RNW Media’s public-facing guide is written specifically for those on the frontlines: journalists, editors, civic media initiatives and NGOs. It offers toolkits and a step-by-step framework to help your team build content authenticity into everyday workflows. In addition, it draws from real-world practice, focus group insights and conversations with partners in Kenya, Nepal, Lebanon, Yemen and beyond. 

“Although open-source tools for authenticity have existed for a while, their adoption across the media industry has been slow,” Ana notes. “But that’s changing. As information environments become more complex, these frameworks are finally being recognised as essential infrastructure.”

What makes this guide valuable is its blend of hands-on advice and ethical reflection. From protecting creators’ rights to navigating AI’s societal impact, it empowers media professionals to act responsibly without falling behind on innovation.

Read the Guide

Work with us

Would you like to know more about our services? Or would you be interested in our masterclasses on content authenticity, AI and journalism? Get in touch with Ana Garza Ochoa through this form.
