A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • finley@lemm.ee (9 points, 2 months ago)

    In that case, the images of children were still used without their permission to create the child porn in question.

    • MagicShel@programming.dev (33 points, 2 months ago)

      That’s not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.

      Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.

    • CeruleanRuin@lemmings.world (8 points, 2 months ago)

      Good luck convincing the AI advocates of this. They have already decided that all imagery everywhere is theirs to use however they like.

    • fernlike3923@sh.itjust.works (7 points, 2 months ago)

      That’s a whole other thing than the AI model being trained on CSAM. I’m currently neutral on this topic, so I’d recommend replying to the main thread.

        • fernlike3923@sh.itjust.works (16 points, 2 months ago, edited)

          It’s not CSAM in the training dataset, it’s just pictures of children/people that are already publicly available. That falls under the copyright side of AI debates rather than illegal training material.

          • finley@lemm.ee (4 points, 2 months ago, edited)

            It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.

            Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?

            Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.