by PAUL MELCHER
The battle is on. The forces of truth against forces of deception. With visual AI making it easier to fake visual content, its credibility is at stake. And with it, the income of thousands upon thousands of people worldwide who depend on the credibility of visuals to thrive: newspapers, magazines, photographers, newswires, webcasters, television news, videographers, journalists, and photo agencies, among many others.
The stakes are very high. If we no longer trust our photos or videos, we lose our primary knowledge of what is truly happening outside of our immediate surroundings. With it, the ability for democracies and societies to function properly.
Trust in Accountability
The best way to beat deepfakes and manipulated images (as in, to uncover their deceptive intent) is by revealing their source. If we know who created a video or photograph, we are more likely to know whether it is real. If nothing else, we will be alerted to its potential to deceive.
When we read an article, we seek out the author to assess its credibility. An author with a long history of proven reliability (and a few Pulitzers to her name) will undoubtedly be trusted more than someone fresh out of college.
The publication, too, will often increase or decrease an author's credibility. An article written by a complete unknown and published in the New York Times will be trusted; the same article, by the same author, published on Breitbart, much less so.
Revealing authorship also forces accountability. It creates a referential history. Together, they are clear markers for trust. While deception hides in the shadows of anonymity, truth needs no filters. Thus, enforcing authorship creates a higher level of credibility.
The Content Authenticity Initiative
This is precisely what the NYT, Adobe, and Twitter have decided to put in place with the Content Authenticity Initiative, launched in November of 2019: a framework that allows for the creation and preservation of the authorship of visual content, one that can be used by anyone from publishers to tech companies. With it, consumers will be able to identify the source of a photo or video, allowing them to make an informed decision on the credibility of a file.
To establish authenticity, one has to clearly establish both authorship and integrity: the file was created by this person/entity (authorship), and the record has not been tampered with (integrity). Authorship, as we have seen above, is fundamental to a file's credibility. Without it, it is impossible to assess intent. Integrity shows if and how the file was edited. It's a show of hands, publicly declaring any alteration. Any edit with the potential to deceive, like the removal or replacement of an object or person, can be clearly acknowledged.
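The authorship/integrity pairing described above is, at its core, a cryptographic signing scheme: bind the creator's identity to a digest of the file's bytes, so any later edit is detectable. The sketch below is purely illustrative, not the CAI's actual format; it uses an HMAC with a shared secret as a simplified stand-in for the public-key signatures a real system would use, and all names in it are hypothetical.

```python
import hashlib
import hmac

def sign_asset(data: bytes, author: str, secret: bytes) -> dict:
    # Authorship: record who created the file.
    # Integrity: record a SHA-256 digest of its exact bytes.
    digest = hashlib.sha256(data).hexdigest()
    payload = f"{author}:{digest}".encode()
    # HMAC stands in for a real public-key signature here.
    signature = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return {"author": author, "digest": digest, "signature": signature}

def verify_asset(data: bytes, record: dict, secret: bytes) -> bool:
    # Recompute the digest: any edit to the bytes changes it.
    digest = hashlib.sha256(data).hexdigest()
    if digest != record["digest"]:
        return False
    payload = f"{record['author']}:{digest}".encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

secret = b"newsroom-signing-key"          # hypothetical key
photo = b"\x89PNG...raw image bytes..."   # stand-in for a real file

record = sign_asset(photo, "jane.doe@example-wire.com", secret)
print(verify_asset(photo, record, secret))            # True: untouched file
print(verify_asset(photo + b"edit", record, secret))  # False: tampered file
```

A consumer-facing tool would flag the second case, telling the viewer that the image no longer matches what its declared author published.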
It seems simple enough. But there are many issues. Claiming authorship should be a choice, not an obligation. Under a totalitarian regime, authors of photos or videos will not want files associated with them, as they would risk imprisonment or death. But letting authors/creators decide if and when they declare ownership changes little of today's situation.
The same goes for integrity. Graphic designers and Photoshop artists are not likely to publish their file-editing history, as doing so would release their trade secrets. Thus, alteration history should also be an option and not an obligation, again leaving today's situation pretty much unchanged.
Will it Work?
For a project like the CAI to work, strong incentives need to be in place. These could come from various sources. Publications like the NYT or Vice News could decide not to accept photos/videos that do not follow the authenticity framework. Platforms like Twitter or Facebook could downrank posts containing non-authenticated content. Anonymous content would need to be vetted by trusted publishers.
Authenticity could also be linked to copyright and license payments, as proposed by Article 17 of the European directive on Copyright. As a framework on authorship, the CAI could be used by tech companies to redistribute revenue to creators. Even designers might change their minds if there is revenue involved.
Of our five senses, vision is, by far, the one we trust the most for critical information. Studies show that when we receive conflicting information from our senses, from sound and touch, for example, vision is the one we rule correct. It is our primary source of trust. If we see it, it exists; if not, it might not be real. If we can no longer believe what we see, our world will be torn apart.
About the author: Paul Melcher is a photography and technology entrepreneur based in New York, and the founder of Kaptur, a news magazine about the visual tech space. The opinions expressed in this article are solely those of the author. You can find more of his writings on his blog, Thoughts of a Bohemian. Melcher offers his services as a consultant as well. This article was also published here.