• Kaldo@kbin.social · 6 months ago

    That’s a very naive perspective, though. We’re not blaming the guns themselves for gun violence, we’re blaming the people, but restricting access to guns is still the proven way to reduce gun incidents. One day, when everyone is enlightened enough not to need such restrictions, we can lift them, but we’re very far from that point, and the same goes for tools like “AI”.

      • Kaldo@kbin.social · 6 months ago

        Very easy, if it’s about commercial use (well, at least outside of China). Companies need to have licenses for the software they use, they have to obey copyright and trademark law, and they have to have contracts and permissions for anything they use in their day-to-day work. It’s the same reason no serious company wants to even touch a competitor’s leaked source code when it appears online.

        Just because AI tech bros live in a bubble of their own, thinking they can just take and repurpose anything they need, doesn’t mean it should be like that. For the most part it isn’t, and in this case the law just hasn’t caught up with the tech yet.

      • fcSolar@lemmy.world · 6 months ago

        It’d be dead easy, actually. You don’t even have to ban it: for image-generating models, every artist whose work is included in the training data becomes entitled to 5 cents per work of theirs in the training data every time the model generates an image, so an artist with 20 works in the model would be entitled to a dollar per generated image. Companies offering image-generating neural networks would near-instantly incur such huge liabilities that it simply wouldn’t be worth it anymore. The same thing could apply to text- and voice-generating models, just per word instead of per image.
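
        As a rough sketch of the arithmetic behind that proposal (Python; the 5-cent rate and the 20-work example are the figures from this comment, while the artist and generation counts at the end are made-up numbers purely for illustration):

            # Hypothetical royalty scheme described above: 5 cents per work an artist
            # has in the training data, owed on every image the model generates.
            RATE_PER_WORK = 0.05  # dollars per training-set work per generated image

            def liability_per_image(works_in_training_data: int) -> float:
                """Royalty owed to one artist each time the model generates an image."""
                return works_in_training_data * RATE_PER_WORK

            print(liability_per_image(20))  # 1.0 -- the "dollar per generated image" example

            # Illustrative scale-up with invented numbers: 100,000 artists with
            # 20 works each, across 1,000,000 generated images.
            print(100_000 * liability_per_image(20) * 1_000_000)  # 100 billion dollars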

        • msgraves@lemmy.dbzer0.com · 6 months ago

          Disregarding the fact that the model learns and extrapolates from the training data rather than copying it,

          have fun figuring out which model made the image in the first place!