• brsrklf@jlai.lu · 25 points · 1 day ago

    Only yesterday, I searched for a very simple figure: the number of public service agents in a specific administrative region. This is, obviously, public information. There is a government site where you can get it, but I didn’t know the exact site, so I searched for it on Google.

    Of course, the AI summary shows up first and gives me a confident answer, accurately mirroring my exact request. However, the number seems way too low to me, so I go check the first actual search result, the aforementioned official site. Google’s shitty assistant took a sentence about a subgroup of agents and presented it as the total. The real number was clearly given earlier on the page, and was about four times that.

    This is just a tidbit of information any human with the source would have identified in a second. How the hell are we supposed to trust AI for complex stuff after that?

    • Patch@feddit.uk · 3 points · 22 hours ago

      The AI models can be hilariously bad even on their own terms.

      Yesterday I asked Gemini for a population figure (because I was too lazy to look it up myself). First I asked it:

      what is the combined population of place 1, place 2 and place 3

      It answered:

      Unfortunately, providing an absolutely precise, up-to-the-minute combined population figure is difficult. Population numbers are constantly changing.

      To find these figures, I would recommend checking the Office for National Statistics (ONS) website. They provide detailed population statistics for local authorities in England.

      On a whim, I asked it again as:

      combined population of place 1, place 2 and place 3

      And then it gave me the answer sweet as a nut.

      Apparently I was being too polite with it, I guess?

      • LinyosT@sopuli.xyz · 2 points · 15 hours ago (edited)

        I slapped a picture of a chart into Gemini because I didn’t know what that type of chart was called, but I wanted to mention it in a uni report. I was too lazy to go looking through chart types and thought that would be quicker.

        I just asked it “What kind of chart is this?” and it ignored that, analysing the chart instead, telling me what it was about and offering insights into it. It never told me what kind of chart it was, even though that was the only thing I asked.

        Bear in mind that I had deliberately cropped out any context to stop it from trying to do exactly that, just in case, so all I got from it was pure hallucination. It was just making shit up that I didn’t ask for.

        I switched to the reasoning model and asked again, and then it gave me the info I wanted.