Adobe has released three open source tools intended to make it easier to verify the authenticity of visual content and trace where it came from.
These tools are part of Adobe's Content Authenticity Initiative (CAI), first announced in 2019 to counter the spread of visual misinformation (which is often produced using Adobe's own photo and video editing software, Photoshop and Premiere Pro).
The company hopes that these tools will help developers embed this technology into their applications. The widespread adoption of this standard might also help the people who create original visual content.
What's Adobe's plan? How does this technology work? Here's all you need to know.
While efforts to tackle visual misinformation have largely focused on using AI to detect deepfakes and other altered media, Adobe is thinking about the problem from another angle.
It wants to make it easy for the public to find out who created the original visual media, and how these photos and videos were altered over time.
This standard, developed by the Coalition for Content Provenance and Authenticity, or C2PA, will also potentially make it easier for creative professionals, artists, and photojournalists to receive credit for their work.
The tools released by Adobe are a way to push more developers to adopt this technology.
Attribution information – the who, what, and how of asset creation and modification – is typically embedded in the metadata of digital assets, where it can easily be changed or erased.
C2PA data, in contrast, "is cryptographically sealed and verifiable by an individual or organization along the path from creation to consumption".
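To see why a "cryptographically sealed" record is harder to tamper with than ordinary metadata, consider the following toy sketch. It is not the real C2PA manifest format, and it uses a shared-secret HMAC where C2PA actually specifies X.509 certificate signatures; the function names and the `SECRET` key are illustrative only. The point is that the claims are bound to a hash of the exact image bytes, so changing either the image or the claims breaks the seal.

```python
# Toy illustration of "sealed" provenance data, NOT the real C2PA format.
# C2PA uses X.509 certificate signatures; here a shared-secret HMAC stands
# in for the signature so the example stays dependency-free.
import hashlib
import hmac
import json

SECRET = b"demo-signing-key"  # stand-in for a real signing credential

def seal(image_bytes: bytes, claims: dict) -> dict:
    """Bind provenance claims to the exact pixels they describe."""
    payload = {
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
        "claims": claims,
    }
    body = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return payload

def verify(image_bytes: bytes, manifest: dict) -> bool:
    """Any change to the image or to the claims invalidates the seal."""
    body = json.dumps(
        {"content_hash": manifest["content_hash"], "claims": manifest["claims"]},
        sort_keys=True,
    ).encode()
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, manifest["signature"])
        and manifest["content_hash"] == hashlib.sha256(image_bytes).hexdigest()
    )

photo = b"...raw image bytes..."
manifest = seal(photo, {"creator": "Jane Doe", "tool": "Photoshop"})
print(verify(photo, manifest))            # True
print(verify(photo + b"edit", manifest))  # False: the edit breaks the seal
```

Ordinary metadata fields, by contrast, can simply be overwritten with no way for a viewer to tell.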
In practice, using the tools provided by Adobe, a social media platform could let its users see the content credentials of all its images and videos just by hovering over an icon.
Adobe’s open source tools are:
JavaScript SDK – A toolkit that helps developers create ways to display content credentials (through an icon, for example).
C2PA Tool – A command-line utility that lets developers' applications interact with the C2PA standard to create, verify, and explore content credentials.
Rust SDK – A toolkit to help developers build custom apps that create, verify, and display content credentials directly via Adobe's library of pre-compiled code.
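The "path from creation to consumption" that these tools expose can be pictured as a chain of records, each one referring back to the record before it. The sketch below is a hypothetical simplification in Python, not the actual manifest structure produced by any of the SDKs above: each edit step stores a hash of the previous step, so quietly rewriting an earlier step makes the whole chain fail verification.

```python
# Toy provenance chain: each edit step records the hash of the step before
# it, so history from capture to publication can be checked end to end.
# This mimics the idea behind C2PA manifests, not their actual format.
import hashlib
import json

def _digest(step: dict) -> str:
    return hashlib.sha256(json.dumps(step, sort_keys=True).encode()).hexdigest()

def add_step(chain: list, action: str, asset_hash: str) -> list:
    """Append an edit record that points back at the previous record."""
    prev = _digest(chain[-1]) if chain else None
    chain.append({"action": action, "asset_hash": asset_hash, "prev": prev})
    return chain

def chain_is_intact(chain: list) -> bool:
    """Recompute every back-pointer; any rewrite of history breaks one."""
    return all(
        chain[i]["prev"] == _digest(chain[i - 1]) for i in range(1, len(chain))
    )

history = []
add_step(history, "captured", hashlib.sha256(b"raw photo").hexdigest())
add_step(history, "cropped", hashlib.sha256(b"cropped photo").hexdigest())
print(chain_is_intact(history))  # True
history[0]["action"] = "generated"  # try to rewrite history...
print(chain_is_intact(history))    # ...and the chain no longer checks: False
```

A viewer app built on the real SDKs would walk a similar chain to show who created an asset and how it was altered over time.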
App makers often use software development kits (SDKs) – essentially ready-made software components from third-party developers – to cut down on time and effort.
Adobe told TechCrunch that the C2PA standard is receiving a "surprising amount of inbound interest" from companies that produce synthetic images and videos, like deepfakes.
Deepfake technology involves using artificial intelligence (AI) to generate convincing images or videos of made-up or real people. It is surprisingly accessible and has been put to various uses, including in entertainment, misinformation, harassment, propaganda and pornography.
A recent study published in the Proceedings of the National Academy of Sciences found that people have just a 50 percent chance of correctly guessing whether a face was generated by artificial intelligence. AI-synthesised faces were found to be indistinguishable from real faces and, somehow, more trustworthy.
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)