I’m guessing it wouldn’t work for a variety of reasons, but having cameras digitally sign the image plus its metadata could be interesting.
They could make it difficult to open up the camera and extract its signing key, but it only takes one person doing it successfully for the entire system to become untrustworthy.
In theory you could have a central authority that keeps track of cameras whose keys have been used to sign known-fake images, but then you’re trusting that authority not to invalidate someone’s keys for doing something it disagrees with. And it still wouldn’t prevent someone from buying a camera, extracting its key themselves, and making fraudulent images with a fresh, still-trusted key.
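Roughly the kind of thing being proposed, as a toy sketch in Python (using the cryptography package and Ed25519; the metadata fields, the key handling, and the helper names are all made up for illustration, not how any real camera works):

    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Stand-in for a key that would be fused into the camera's secure hardware.
    camera_key = Ed25519PrivateKey.generate()
    camera_pub = camera_key.public_key()  # the manufacturer would publish this

    def sign_capture(image_bytes, metadata):
        # Sign the image together with its metadata so neither can be swapped later.
        blob = image_bytes + json.dumps(metadata, sort_keys=True).encode()
        return camera_key.sign(blob)

    def verify_capture(image_bytes, metadata, signature):
        blob = image_bytes + json.dumps(metadata, sort_keys=True).encode()
        try:
            camera_pub.verify(signature, blob)
            return True
        except InvalidSignature:
            return False

    image = b"...raw sensor data..."
    meta = {"timestamp": "2024-01-01T12:00:00Z", "gps": [40.7, -74.0]}
    sig = sign_capture(image, meta)
    assert verify_capture(image, meta, sig)
    assert not verify_capture(image + b"tampered", meta, sig)

Note that verification only tells you the bytes came from some camera holding that key, which is exactly why one extracted key breaks the whole scheme.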
Anything that exists now and that people will want to authenticate in the future, they can publish a hash of today.
So long as people trust that the hash really was published now, then in the future, once such content is fakable, they can trust that the original existed before the faking capability was developed.
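For concreteness, a minimal sketch of that idea, assuming plain SHA-256 and that “publishing” just means putting the hex digest somewhere whose date can’t be disputed (a newspaper ad, a widely mirrored log, etc.); the file name here is invented:

    import hashlib

    def digest(path):
        # SHA-256 of a file; this hex string is what you'd publish today.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Demo file standing in for a photo you want to authenticate later.
    with open("original_photo.raw", "wb") as f:
        f.write(b"...raw sensor data...")

    # Now: publish this digest somewhere with an indisputable publication date.
    published = digest("original_photo.raw")

    # Years later: anyone holding the file recomputes the hash and compares.
    assert digest("original_photo.raw") == published

This only proves the file existed when the hash was published; it says nothing about whether the file was genuine at that time.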