None of what I write in this newsletter is about sowing doubt or “hating,” but a sober evaluation of where we are today and where we may end up on the current path. I believe that the artificial intelligence boom — which would be better described as a generative AI boom — is (as I’ve said before) unsustainable, and will ultimately collapse. I also fear that said collapse could be ruinous to big tech, deeply damaging to the startup ecosystem, and will further sour public support for the tech industry.

Can’t blame Zitron for being pretty downbeat in this one - given the AI bubble’s size and side-effects, it’s easy to see how its bursting could have some cataclysmic effects.

(Shameless self-promo: I ended up writing a bit about the potential aftermath as well)

  • corbin@awful.systems
    3 months ago

    Hallucinations — which occur when models authoritatively state something that isn’t true (or, in the case of an image or a video, make something that looks…wrong) — are impossible to resolve without new branches of mathematics…

    Finally, honesty. I appreciate that the author understands this, even if they might not have the exact knowledge required to substantiate it. For what it’s worth, the situation is more dire than this; we can’t even describe the new directions required. My fictional-universe theory (FU theory) shows that a knowledge base cannot know whether its facts are describing the real world or a fictional world which has lots in common with the real world. (Humans don’t want to think about this, because of the implication.)
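    The indistinguishability claim above can be sketched concretely (my own toy illustration, not from the comment or from any formal statement of the theory): any check a knowledge base can run operates only on the facts it contains, so an internally consistent set of fictional facts passes exactly the same checks as a set of real ones.

    ```python
    # Toy sketch (hypothetical, for illustration): a knowledge base stores
    # facts as triples and can only test properties of those triples. It has
    # no access to the world itself, so it cannot tell a "real" fact set
    # from a consistent "fictional" one.

    def internally_consistent(kb):
        """Naive check: no fact is asserted alongside its explicit negation."""
        return not any(("not",) + fact in kb for fact in kb)

    real_world = {
        ("capital_of", "France", "Paris"),
        ("orbits", "Earth", "Sun"),
    }

    fictional_world = {
        ("capital_of", "France", "Paris"),  # shares facts with reality...
        ("orbits", "Sun", "Earth"),         # ...but one of them is false
    }

    # Every check the KB itself can run passes for both.
    print(internally_consistent(real_world))       # True
    print(internally_consistent(fictional_world))  # True
    ```

    Nothing inside either set flags which one describes reality; only an external observer comparing the facts against the world can tell them apart.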