It certainly wasn’t because the company is owned by a far-right South African billionaire at the same moment that the Trump admin is entertaining a plan to grant refugee status to white Afrikaners. /s

My partner is a real refugee. She was jailed for advocating democracy in her home country. She would have received a lengthy prison sentence after trial had she not escaped. This crap is bullshit. Btw, did you hear about the white genocide happening in the USA? Sorry, I must have used Grok to write this. Go Elon! Cybertrucks are cool! Twitter isn’t a racist hellscape!

The stuff at the end was sarcasm, you dolt. Shut up.

  • snooggums@lemmy.world · 2 days ago

    Unintentionally is the right word because the people who designed it did not intend for it to produce bad information. They chose an approach that results in bad information because of the data they chose to train it on and the steps they took throughout the process.

    • ilinamorato@lemmy.world · 6 hours ago (edited)

      Honestly, a lot of the issues result from null results only existing in the gaps between information (unanswered questions, questions closed as unanswerable, searches that return no results, etc.), and thus being nonexistent in training data. A model is therefore predisposed toward giving an answer of any kind, and if one doesn’t exist it’ll “make one up.”

      Which is itself a misnomer, because it can’t look for an answer and then decide to make one up when it can’t find it. It just gives an answer that sounds plausible, and if the correct answer is well represented in its training data, that answer will usually also be the most plausible one.
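      To make that concrete, here’s a toy sketch in Python (purely illustrative; the candidate tokens and scores below are invented, not taken from any real model). A decoder only ever ranks candidate continuations and emits the most plausible one, so “no answer” isn’t a possible outcome unless abstaining text was itself common in training:

      ```python
      import math

      # Toy next-token step. A language model's decoder always turns its
      # scores (logits) into a probability distribution over candidates,
      # so it always has "an answer", even when no correct one exists.
      def softmax(logits):
          exps = [math.exp(x) for x in logits]
          total = sum(exps)
          return [e / total for e in exps]

      # Hypothetical continuations for "What year did X happen?" when the
      # true answer never appeared in training. "I don't know" is rare in
      # training text for this kind of question, so its score is low.
      # All numbers here are made up for illustration.
      candidates = ["1987", "1992", "2003", "I don't know"]
      logits = [2.1, 1.9, 1.7, -0.5]

      probs = softmax(logits)
      for token, p in zip(candidates, probs):
          print(f"{token:12s} {p:.2f}")

      # Greedy decoding just takes the highest-probability candidate:
      # a confident-sounding guess, not a detected gap it chose to fill.
      best = max(zip(probs, candidates))[1]
      print("model says:", best)
      ```

      There’s no “search, fail, then fabricate” step anywhere in that loop, which is the point: the fabrication is just ordinary decoding with nothing better to rank first.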

    • knightly the Sneptaur@pawb.social · 2 days ago

      Incorrect. The people who designed it did not set out with the goal of producing a bot that regurgitates true information. If that’s what they wanted, they’d never have used a neural network architecture in the first place.