A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be put to nefarious purposes.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the TV station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

    • Saledovil@sh.itjust.works · 6 points · 2 months ago

      Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it the equivalent of eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.

    • emmy67@lemmy.world · 1 point · 2 months ago

      It didn’t generate what we expect and know a corn dog is.

      Hence it missed, because it doesn’t know what a “corn dog” is.

      You have proven the point that it couldn’t generate CSAM without some being present in the training data.

      • ContrarianTrail@lemm.ee · 3 points · 2 months ago

        I hope you didn’t seriously think the prompt for that image was “corn dog,” because if your understanding of generative AI is at that level, you probably should refrain from commenting on it.

        Prompt: Photograph of a hybrid creature that is a cross between corn and a dog

        • emmy67@lemmy.world · 1 point · 2 months ago

          Then if your question is “how many photographs of a hybrid creature that is a cross between corn and a dog were in the training data?”

          I’d honestly say, I don’t know.

          And if you’re honest, you’ll say the same.

          • ContrarianTrail@lemm.ee · 3 points · 2 months ago

            But you do know, because corn dogs as depicted in the picture do not exist, so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.

            This is because it doesn’t need to have seen one before. It knows what corn looks like and it knows what a dog looks like, so when you ask it to combine the two, it will gladly do so.

            • emmy67@lemmy.world · 1 point · 2 months ago

              > But you do know, because corn dogs as depicted in the picture do not exist, so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.

              Yeah, except Photoshop and artists exist. And a quick Google image search will find them. 🙄

              • ContrarianTrail@lemm.ee · 2 points · 2 months ago

                And this proves that AI can’t generate simulated CSAM without first having seen actual CSAM how, exactly?

                To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.

                • emmy67@lemmy.world · 1 point · 2 months ago

                  I wasn’t the one attempting to prove that. Though I think it’s definitive.

                  You were attempting to prove it could generate things not in its data set, and I have disproved your theory.

                  > To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.

                  To me, the takeaway is that you know less about AI than you claim. Much less. Because we have actual instances, and many of them, where CSAM is in the training data. Don’t believe me?

                  Here’s a link to it

                  • ContrarianTrail@lemm.ee · 2 points · 2 months ago

                    > You were attempting to prove it could generate things not in its data set, and I have disproved your theory.

                    I don’t understand how you could possibly imagine that pic somehow proves your claim. You’ve made no effort to explain yourself. You just keep dodging my questions when I ask you to. A shitty Photoshop of a “corn dog” has nothing to do with how the image I posted was created. It’s a composite of corn and a dog.

                    Generative AI, just like a human, doesn’t rely on having seen an exact example of every possible image or concept. During its training, it was exposed to huge amounts of data, learning patterns, styles, and the relationships between them. When asked to generate something new, it draws on this learned knowledge to create a new image that fits the request, even if that exact combination wasn’t in its training data.

                    > Because we have actual instances, and many of them, where CSAM is in the training data.

                    If the AI has been trained on actual CSAM, and especially if the output simulates real people, then that’s a whole other discussion to be had. That, however, is not what we’re talking about here.