September 17, 2024
New Tech From Camera Makers Tries to Prove Photos Are Not AI Fakes
  • Nikon, Sony, and Canon are all adding tamper-resistant digital watermark tech to their cameras. 
  • You will be able to use software to verify that photos haven’t been altered. 
  • It won’t help stem the flood of AI-generated deepfakes on social media.


[Image: AI face swap used for a deepfake. Tero Vesalainen / Getty Images]



Nikon, Sony, and Canon will add new tech to embed digital signatures into photos so users can prove their photos are not AI-generated. It’s good for journalists and photo editors but is little more than a band-aid on a big problem: the loss of trust. 


With the rise of photorealistic AI-generated images and deepfakes posted as real on social media, it’s easy to falsely shape the news narrative. We tend to trust photographs and video in a way we don’t trust other media, which makes it all the more important that we can verify photos are genuine. These three leading camera makers want to help restore that trust and, presumably, ensure that photojournalism remains relevant. 


“The importance of proving a photo’s reality cannot be overstated, especially for professionals like journalists and photo editors, whose work’s integrity is paramount. With so much AI-generated content all around us, this technology stands out as a reliable source of authenticity in a world where misinformation is common,” Brian Prince, founder and CEO of AI educational platform Top AI Tools, told Lifewire via email.

“It’s not merely about distinguishing real from fake; it’s about upholding the values of truth and credibility that are fundamental to these professions.”




Real Fake

The new authentication technology that Sony, Nikon, and Canon are adding to their cameras embeds a tamper-resistant digital signature into every image captured. The signature contains data like the date, time, location, and photographer’s name, and can be used to verify that the image has not been altered in any way. 



Sony will add this feature to existing cameras this spring via a firmware update, and Canon will release an image authentication app to do the checking. 
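To picture how such a signature works, here is a minimal sketch in Python of signing an image together with its capture metadata and later checking it. The key handling, metadata fields, and Ed25519 scheme here are illustrative assumptions; the camera makers have not published their exact implementation details.

```python
# Minimal sketch of in-camera signing and later verification.
# The key names, metadata fields, and Ed25519 scheme are illustrative
# assumptions, not the camera makers' actual format.
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In practice the private key would live in tamper-resistant hardware
# inside the camera; here we simply generate one for the demo.
camera_key = Ed25519PrivateKey.generate()
public_key = camera_key.public_key()


def sign_capture(image_bytes: bytes, metadata: dict) -> bytes:
    """Sign the image bytes together with the capture metadata."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return camera_key.sign(payload)


def verify_capture(image_bytes: bytes, metadata: dict, signature: bytes) -> bool:
    """Return True only if neither the pixels nor the metadata changed."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    try:
        public_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False


# Any edit to the pixels or metadata breaks verification.
image = b"...raw image data..."
meta = {"date": "2024-09-17", "location": "Oslo", "photographer": "A. Example"}
sig = sign_capture(image, meta)
print(verify_capture(image, meta, sig))            # True
print(verify_capture(image + b"edit", meta, sig))  # False
```

The design choice worth noting: the signature covers the pixels and the metadata together, so changing the stated date or location invalidates the image just as surely as retouching it would.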


One use case could be for a news publication to require these digital signatures from all staff and contributing photographers in order to confirm that submitted photos have not been tampered with. Readers of those publications could then trust that what they see is real, although, of course, they’d still have to trust the editors of that magazine or news website. 


This is essential if we are to have any hope of continuing to use photos as evidence of something real happening. 


“Being able to prove a photo is authentic will have important legal and cultural implications, especially as AI images become indistinguishable from the real thing,” Gareth Barkin, dean of operations and technology at the University of Puget Sound, told Lifewire via email. “But proving some photos are authentic is not the same as being able to clearly identify whether something is or isn’t an AI image. Images could have plausibly been generated by older gear, different photo equipment brands, or phone cameras.”



Social Mess

But even if all camera makers signed on to this standard, it only helps honest photographers prove that they are honest. It won’t necessarily stop the dissemination of AI-generated fakes on social media or by unscrupulous media outlets. When Israel invaded Gaza last year, the number of faked images on social media was alarming. 



[Image: Deepfake statement. Olivier Douliery / Getty Images]



For example, some stories used AI-generated images, while others passed off genuine photos from elsewhere as having been taken in Gaza. To add to the confusion, some AI-spotting tools flagged genuine photos as fakes, and Adobe (yes, that Adobe) was selling AI-generated “photos” of destroyed buildings in Gaza. 


To stop this fakery, we’d need every camera, including years-old models and every phone camera, to support the same watermarking feature.


“To give us the tools we need to really distinguish AI-generated images from real photos and return trust to photography, all camera and phone manufacturers would need to sign on to this digital signature standard,” said Barkin. 


On top of that, you’d need to educate people to check these watermarks and make it easy to do. You’d also have to deal with sources that claimed to have checked that an image was “100% genuine real, not fake” when posting it to social media. 


That’s a tall order, and failing to meet it could end up changing our relationship with photography after more than a century of trusting it. 


“When we trust a photo, we trust the photographer’s eye, their intention, and their ability to capture a genuine slice of reality. This trust allows us to engage with the image on a deeper level, to connect with the emotions or perspective it conveys. It’s this element of trust that gives photography its power to move us, inform us, and even challenge our ideas,” photographer Crissibeth Cooper told Lifewire via email.

