Look at this verification
Every night it even makes me legitimize
I never understood how they were useful in the first place, but that’s beside the point. I assume this is referencing AI, but since you’ve only posted one photo out of apparently four, I don’t really have any idea what you’re posting about.
The point of verification photos is to ensure that nsfw subreddits only host content posted with consent. Many posts were just random nudes someone had found, posted without the subject’s consent.
The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. Subs often require the paper to be crumpled to make it infeasible to photoshop.
If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.
I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.
Was it really that hard to Photoshop well enough to get past mods who aren’t experts in photo forensics?
Probably not, but it would still reduce the amount considerably.
Thank goodness we can now use AI to do something that could already easily be done by taking a picture off someone’s social media.
Because so many people have tried to impersonate me on the internet, I’ve become somewhat of an expert on verification pictures.
You can still easily tell that this is fake because if you look closely, the details, especially the background clutter, are utterly nonsensical.
- The object over her right shoulder (your left), for example, looks as if someone blended a webcam, a TV, and a nightstand.
- Over her left shoulder (your right), her chair is only on that one side and it blends into the counter in the background.
- Is it a table lamp or a wall mounted light?
- The doorframe in the background behind her head isn’t even aligned.
- Her clavicles are asymmetrical; I’ve never seen that on a real person.
- Her wispy hair strands. Real hair doesn’t appear out of thin air in loops.
The point isn’t that you can spot it.
The point is that the automated system can’t spot it.
Or are you telling me there’s a person looking at every verification photo, thoroughly scanning each one for imperfections?
The idea of using a picture upload for automated verification is completely unviable. A much more commonly used system would be something like telling you to perform a random gesture on camera on the spot, like “turn your head slowly” or “open your mouth slowly” which would be trivial for a human to perform but near impossible for AI generators.
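The on-the-spot challenge described above works because the prompt is unpredictable and time-bound: a pre-generated video can’t match a gesture the verifier only just picked. Here’s a minimal sketch of that idea in Python; the gesture list, nonce, and expiry window are all illustrative assumptions, not any real site’s implementation.

```python
import secrets
import time

# Hypothetical gesture pool -- a real system would verify the gesture
# itself from the video; this sketch only covers challenge issuance.
GESTURES = [
    "turn your head slowly",
    "open your mouth slowly",
    "raise your left hand",
    "blink twice",
]

def issue_challenge(ttl_seconds=30):
    """Pick an unpredictable gesture, plus a nonce and an expiry time."""
    return {
        "gesture": secrets.choice(GESTURES),
        "nonce": secrets.token_hex(8),  # ties the response to this session
        "expires_at": time.time() + ttl_seconds,
    }

def is_response_valid(challenge, responded_nonce, now=None):
    """A response only counts if it echoes the nonce before the deadline."""
    now = time.time() if now is None else now
    return responded_nonce == challenge["nonce"] and now <= challenge["expires_at"]
```

The short expiry is the point: even if generators eventually fake gestures convincingly, they’d have to do it live, within seconds, for a prompt they couldn’t predict.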
but near impossible for AI generators.
…I feel like this isn’t the first time I’ve heard that statement.