I think every touch-up besides color correction and cropping should be labeled as “photoshopped”. And any use of AI should be labeled as “Made with AI”, because it cannot show which parts are real and which are not.
Besides, this is totally a skill issue. Removing this metadata is trivial.
Some of the more advanced color correction tools can drastically change an image. There’s a lot of gray area in drawing that line as well.
DOD Imagery guidelines state that only color correction can be applied to “make the image appear the same as it was when it was captured”; otherwise it must be labeled “DOD illustration” instead of “DOD Imagery”.
Cropping can completely change the context of a photo.
Sure, but you could also achieve a similar effect in-camera by zooming in or moving closer to the subject.
A lot of photographers will take a photo with the intention of cropping it. Cropping isn’t photoshopping.
Image manipulation is still image manipulation
If I open an image in Photoshop and crop it, it’s photoshopping.
You don’t have to open Photoshop to do it. Any basic editing software will include a cropping tool.
🤦‍♂️
There are absolutely different levels of image editing. Color correction, cropping, scaling, and rotation are basic enough that I would say they don’t even count as alterations. They’re just correcting what the camera didn’t, and they’re often available in the camera’s built-in software. (Fun fact: what the sensor sees is not what it presents to you in a JPEG.) Then there are more deceptive levels of editing, like removing or adding objects, altering someone’s appearance, or swapping in faces from different shots. Those are definitely image alterations, and they’re what most people mean when they say an image is “photoshopped” (and you know that, don’t lie). Then there’s AI, where you’re just generating new information to put into the image. That’s extreme image alteration.
These all can be done with or without any sort of nefarious intent.
Removed by mod
So we agree cropping is plain and simple image editing, yes?
Yes. I think the question was should it be labeled as “photoshopped” (or probably “manipulated”). I don’t think it should. I think those labels would be meaningless if you can’t even change the aspect ratio of a photo without it being called “photoshopped”.
Agreed. Photo editing has great applications but we can’t pretend it’s never used maliciously.
Film too, any trickery in the darkroom should be labeled because it cannot show which parts are real and which are not.
Why label it if it is trivial to avoid the label?
Doesn’t that mean that bad actors will have additional cover for misuse of AI?
Yes
What do you mean by “real”?
Better title: “Photographers complain when their use of AI is identified as such”
Removed by mod
People are complaining that an advanced fill tool that’s mostly used to remove a smudge or something is automatically marking a full image as an AI creation. As is, if someone actually wants to bypass this “check”, all they have to do is strip the image’s metadata before uploading it.
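For what it’s worth, “strip the image’s metadata” really is a one-step job. Here’s a minimal sketch in Python using Pillow, assuming the provenance data lives in the file’s EXIF/XMP metadata; the file names are made up for illustration:

```python
# Minimal sketch: re-encode only the pixel data into a fresh file, which
# drops EXIF/XMP metadata (including any AI-provenance tags) along the way.
# Assumes Pillow is installed; the JPEG gets re-compressed, so there is a
# small quality cost.
from PIL import Image

def strip_metadata(src: str, dst: str) -> None:
    with Image.open(src) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # copy pixels only, no metadata
        clean.save(dst)

strip_metadata("edited_photo.jpg", "edited_photo_clean.jpg")
```

Dedicated tools like exiftool can do the same without re-compressing the image, which is exactly why a metadata-only label is such a weak check.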
But they did use AI…
Right? I thought I went crazy when I got to “I just used Generative Fill!” Like, he didn’t just auto adjust the exposure and black levels! C’mon!
No - I don’t agree that they’re completely different.
“Made by AI” would be completely different.
“Made with AI” actually means pretty much the exact same thing as “AI was used in this image” - it’s just that the former lays it out baldly and the latter softens the impact by using indirect language.
I can certainly see how “photographers” who use AI in their images would tend to prefer the latter, but bluntly, fuck 'em. If they can’t handle the shame of the fact that they did so they should stop doing it - get up off their asses and invest some time and effort into doing it all themselves. And if they can’t manage that, they should stop pretending to be artists.
I think the wording is a bit unclear, personally. “Made with”, despite technically meaning what you’re saying, is often colloquially used to mean “fully created by”. I don’t mind the AI tag, but I do see the photographers’ point about it implying wholesale generation instead of touch-ups.
deleted by creator
I totally agree with a streamlined identification of images generated by an AI prompt. But to label an image with “made with AI” metadata when the image is original, taken by a human, and simply used AI tools to edit, is absolutely misleading, and the language can create confusion. It is not fair to the individual who has created the original work without the use of generative AI. I simply propose revising the language to create a distinction.
deleted by creator
Where I live, it is very difficult to get permits to knock down an old building and build a new one. So builders will “renovate” by knocking down everything but a single wall and then building a new structure around it.
I can imagine people using that to get around the “made with AI” label. “I just touched it up!”
It’s like they’re ignoring the pixel I captured in the bottom left!
Really interesting analogy.
Also I imagine most anybody who gets a photo labeled will find a trick before making their next post. Copy the final image to a new PSD… print and scan for the less technically inclined… heh
I mean you can just remove the metadata of any image, so that doesn’t really matter.
“simply used AI tools”
Therefore, made with AI.
Or generated with AI like Midjourney; therefore, made with AI.
There’s a huge difference between the two, yet no clear distinction when both are lumped under the label of “made with AI”.
yeah, i use Lightroom ai de-noise all the time now. it’s just a better version of a tool that already existed, and one that every phone does by default anyway.
And I use AI to determine the right brightness level for my phone screen (that was a feature added several android versions ago)
The label is accurate. Quit using AI if you don’t want your images labeled as such.
Artists in 2023: “There should be labels on AI modified art!!”
Artists in 2024: “Wait, not like that…”
I feel like these are two completely different sets of artists.
no, they just replaced the normal tools with ai-enhanced versions and are labeling everything like that now.
ai noise reduction should not get this tag.
I don’t know where you got that from, but this post literally talks about tools such as gen fill (select a region, type what you want in it, and AI image generation makes it and places it in).
or… don’t use generative fill. if all you did was remove something, regular methods do more than enough. with generative fill you can just select a part and say now add a polar bear. there’s no way of knowing how much has changed.
there’s a lot more than generative fill.
ai denoise, ai masking, ai image recognition and sorting.
hell, every phone is using some kind of “ai enhanced” noise reduction by default these days. these are just better versions of existing tools that have been used for decades.
the post says gen fill
This would be more suited for asklemmy, this community isn’t for opinion discussions
Can’t wait for people to deliberately add the metadata to their image as a meme, such that a legit photograph without any AI used gets the unremovable “Made with AI” tag.
Generative fill on a dummy layer, then apply 0% opacity
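Or skip the editor entirely and write the tag straight into the file. A hedged sketch of the idea, assuming the label keys off IPTC’s DigitalSourceType metadata field (the platforms haven’t documented their exact triggers) and that exiftool is installed; the file name is made up:

```python
# Hypothetical sketch of the “label as a meme” idea: stamp an ordinary,
# untouched photo with the IPTC DigitalSourceType value used for
# AI-composited images. Whether a given platform actually reads this field
# is an assumption, not something confirmed here.
import subprocess

AI_COMPOSITE = ("http://cv.iptc.org/newscodes/digitalsourcetype/"
                "compositeWithTrainedAlgorithmicMedia")

def tag_as_ai(path: str) -> None:
    # exiftool writes the XMP-iptcExt DigitalSourceType tag in place
    subprocess.run(
        ["exiftool", f"-XMP-iptcExt:DigitalSourceType={AI_COMPOSITE}", path],
        check=True,
    )

tag_as_ai("untouched_photo.jpg")
```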
That’s the difference between “by” and “with”.
People have a hard time with nuance.
Bad photographers complaining about being called out as bad photographers.
I don’t think that’s fair. AI won’t turn a bad photograph into a good one. It’s a tool that quickly and automatically does something we’ve been doing by hand until now. That’s kind of like saying a photoshopped picture isn’t “good” or “real”. They’re all photoshopped. Not a single serious photographer releases unedited photos, except perhaps the ones shooting on film.
Even film photographers touch up their photos, either during development by adjusting how long they sit in one of the chemical processes, or by using different shaking/mixing methods and techniques.
If they enlarge their negatives on photo paper, they often have tools to add lightness and darkness to different areas of the paper to help with exposure, contrast, and subject highlighting, AKA dodging and burning, which is also available in most photo editing software today.
There are loads of things you can do to improve developed photos, and it has always been something that photographers/developers do. People who still go with the “Don’t edit photos” BS are usually not very well informed about photo history or the techniques of their photography inspirations.
Why many word when few good?
Seriously though, “AI” itself is misleading but if they want to be ignorant and whiny about it, then they should be labeled just as they are.
What they really seem to want is an automatic metadata tag that is more along the lines of “a human took this picture and then used ‘AI’ tools to modify it.”
That may not work because by using Adobe products, the original metadata is being overwritten so Thotagram doesn’t know that a photographer took the original.
A photographer could actually just type a little explanation (“I took this picture and then used Gen Fill only”) in a plain text document, save it to their desktop, and copy & paste it in.
But then everyone would know that the image had been modified - which is what they’re trying to avoid. They want everyone to believe that the picture they’re posting is 100% their work.
We’ve been able to do this for years, way before the fill tool utilized AI. I don’t see why it should be slapped with a label that makes it sound like the whole image was generated by AI.
This isn’t really Facebook. This is Adobe not drawing a distinction between smart pattern recognition for backgrounds/textures and real image generation of primary content.