From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren’t ‘too far gone’ to reconsider their convictions and change their minds.

  • davidgro@lemmy.world · 49 points · 2 months ago

    Anything can be used to make people believe conspiracy theories. That’s not new or a challenge.

    I’m genuinely surprised that removing such beliefs is feasible at all though.

    • SpaceNoodle@lemmy.world · 8 points · 2 months ago

      If they’re gullible enough to be suckered into it, they can similarly be suckered out of it - but clearly the effect would not be permanent.

      • Zexks@lemmy.world · 3 points · 2 months ago

        That doesn’t square with the “if you didn’t reason your way into a belief, you can’t reason your way out of it” line. Considering religious fervor, I’m more inclined to believe that line than yours.

        • Azzu@lemm.ee · 5 points · 2 months ago

          No one said that the AI used “reason” to talk people out of a conspiracy theory. In fact, I’d assume that’s incredibly unlikely, since AI in general doesn’t actually reason.