I mean, it did bother people; it just took more skill and time with photo-manipulation software to make it look convincing. It was rare for someone to both have the expertise and be willing to put in the time, so it didn’t come up often enough to become a point of discussion. AI just makes it quick and easy enough to become more common.
Regular editing is much easier and quicker than installing, configuring, and using Stable Diffusion. People acting like “AI” is a one-click solution that gets you convincing-looking images have probably never used it.
It literally is a one-click solution. People are running nudifying sites that use CLIP, GroundingDINO, SegmentAnything, and Stable Diffusion to autonomously nudify people’s pictures.
These sites (which I won’t even name) just ask for a decent-quality photo of a woman, ideally wearing a crop top or bikini for best results.
The people with the know-how to set up Stable Diffusion and all these other AI photo-manipulation tools are using those skills to monetize sexual-exploitation services. They’re making it so you don’t need to know what you’re doing to participate.
And sites like Instagram, which are filled with millions of exploitable images of women and girls, have allowed these perverted services to advertise their warez to their users.
It is now many orders of magnitude easier than it ever has been in history to sexually exploit people’s photographs. That’s a big deal.
If you wanna pay for that then you do you. lol
But at that point you could’ve just as well paid a shady artist to do the work for you.
Also, maybe don’t pose half naked on the internet in the first place if you don’t want people to see you in a sexual way. That’s just weird, just like this whole IG attention-whoring nowadays. And no, this isn’t even just a women thing. Just look how thirsty women get under the images of good-looking dudes who pose topless, or under your ordinary celeb doing ordinary things (Pedro Pascal = daddy, and yes, that includes more explicit comments too).
This hypocritical fake outrage is just embarrassing.
These services don’t even cost anything; they’re just loaded with ads. Do you understand how the Internet works?
Ad hominem.
And they do, since they’re either token- or subscription-based. You can’t really finance the traffic and server load with ads alone unless you run some very amateurish model that looks completely ridiculous, which brings us once again back to one-minute photo-editing jobs that would look better.
And straight to victim blaming, on an issue that affects women orders of magnitude more than men. You go straight to implying they consented because of what they were wearing, and to calling them whores for daring to have sexual agency.
Women can pose in whatever clothes they want online, that doesn’t give you the right to sexually violate them. You have a rapist mindset…
Go fuck yourself, you misogynistic piece of shit.
And you’ve shown your true colors even more. You don’t have a point. But enjoy me violating you, I just drew a little stick figure with your name on it, it has a tiny dick. Now go and call the police on me for nudifying and abusing you, please.
Your virtuous fake outrage is just moronic and hypocritical. Maybe learn a bit about the actual topic first before accusing people of severe crimes such as rape, especially when they’re literal victims, because that’s the real abusive thing here.