
Camera makers want to watermark human images (not AI ones)

Major camera makers Nikon, Sony, and Canon are working on tech to add digital signatures to photos, helping people spot fake AI-generated images.

What's going on here?

Leading camera makers want a reliable way to tell real photos apart from AI-generated or AI-edited images.

What does this mean?

Deepfakes are getting crazy good these days. It's getting really hard to know what's real anymore. To fight this, Nikon, Sony, and Canon are creating built-in features to add tamper-proof digital signatures to photos taken by their new cameras. These signatures will include key details like when and where the photo was taken, and who took it.

These camera brands have teamed up on a shared standard for these digital signatures, which can be checked through a free online tool called Verify. If a photo doesn't carry credentials, Verify flags it as potentially AI-generated.
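The article doesn't detail how the standard works under the hood, but the core idea of binding capture metadata to image data with a tamper-evident signature can be sketched. This is a simplified illustration, not the actual standard: real cameras would use public-key signatures backed by hardware keys, whereas this sketch uses an HMAC with a stand-in key just to show why any edit to the pixels or the metadata invalidates the credential.

```python
import hashlib
import hmac
import json

# Stand-in for a secret key sealed inside the camera (hypothetical).
CAMERA_KEY = b"secret-key-inside-camera"

def sign_photo(image_bytes: bytes, metadata: dict) -> str:
    """Produce a signature covering both the pixels and the capture metadata."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()

def verify_photo(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    """Recompute the signature; any change to pixels or metadata makes it fail."""
    expected = sign_photo(image_bytes, metadata)
    return hmac.compare_digest(expected, signature)

photo = b"\x89raw-sensor-data..."
meta = {"taken_at": "2023-12-30T10:00:00Z",
        "location": "Tokyo",
        "author": "A. Photographer"}
sig = sign_photo(photo, meta)

assert verify_photo(photo, meta, sig)                             # untouched photo passes
assert not verify_photo(photo + b"edit", meta, sig)               # edited pixels fail
assert not verify_photo(photo, {**meta, "location": "NYC"}, sig)  # edited metadata fails
```

A tool like Verify would perform the verification half with the camera maker's public key, so anyone can check a photo's credentials without being able to forge them.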

Sony and the Associated Press tested a version of this tech for photojournalism use cases, and Sony plans to bring it to its cameras via a firmware update. Nikon and Canon plan to release new cameras with similar features.

Why should I care?

Adobe and Google have talked extensively about embedding watermarks and "made by AI" credentials in their AI-generated images. The camera makers take the opposite approach: watermarking the "real" photos instead. This could have far-reaching implications; for example, courts might one day accept only credentialed photos as evidence.
