• queermunist she/her@lemmy.ml
    2 months ago

    Unless the summary doesn’t accurately represent the text — which happens — and then a researcher chooses not to read it based on the chatbot’s output.

    Nirvana fallacy.

    All these chatbots do is guess. I’m just saying a researcher might as well cut out the hallucinating middleman.