  • Fully on board with that. It’s why in journalism an indicator of closeness makes a source description more relevant, like “Democratic senator” or “someone close to the president”. Moreover, you have to question the publisher’s alignment and dedication to truthfulness.

    But if people lack the critical reading skills and already mistake “unverified” for “anonymous source [of function/closeness to the subject] according to [insert news agency]”, they are just trying to find truth in a statement meant to give you doubt.

    Edit: On alignment of the publisher: “Newsmax TV holds a conservative political stance, broadcasting many programs hosted by conservative media personalities. CEO Christopher Ruddy has compared the network to Fox News.”

    Fox News itself has argued that its programming should not be considered actual news reporting.

    Why would a reliable source close enough to the president to know the truth about campaign aspirations go to a Fox News clone?

  • I’d expect a lot of models to struggle with making the pope female, making the pope black, or making a black woman a pope, unless some kind of substitution technique is built in. Thing is, a neural net reproduces what you put into it, and I assume the bias leans heavily towards old white men, since those images are far more readily found.

    Even targeted prompts, like a zebra with rainbow-colored stripes, gave very limited results six months ago: it was rare to get an image where even half the stripes were anything other than black and white. I had to generate multiple times with a lot of negative terms just to get close. Currently, Copilot’s first generation matches the idea behind my prompt.
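    For what it’s worth, that negative-term workflow looks roughly like this with the open-source diffusers library (a minimal sketch, assuming a Stable Diffusion checkpoint; the model name and the exact negative terms are illustrative, and Copilot’s own pipeline is not public):

    ```python
    # Minimal negative-prompting sketch with Hugging Face diffusers.
    # Checkpoint and prompt terms are illustrative only.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    image = pipe(
        prompt="a zebra with rainbow colored stripes",
        # Negative terms steer the sampler away from the dataset default.
        negative_prompt="black and white stripes, monochrome, grayscale",
        num_inference_steps=30,
    ).images[0]
    image.save("rainbow_zebra.png")
    ```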

    Clearly the step made was a big one, and I imagine tuning was done to ensure the models can return more diverse results rather than just what is in the data set. It just produces more unexpected results and less historically accurate images for these kinds of prompts. And some that might be quite painful. Still, always being underrepresented in data sets is also quite painful.

    It’s hard to get to a perfect product quickly, but there should be a feature somewhere on their backlog to prevent some substitutions by default. Black, female popes when requesting a generated pope? To me, that is a horizon-broadening feature. Black, female nazis when requesting nazis? Let that not be a default result.
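    That backlog feature could be as simple as a category guard in front of whatever diversity injection the pipeline does. Purely a hypothetical sketch: the term list, function names, and the injected phrasing are all my assumptions, not anything the vendors have published.

    ```python
    # Hypothetical guard: skip demographic substitution for prompts that
    # touch historically sensitive categories. Term list, function names,
    # and the injected phrasing are illustrative assumptions only.
    SENSITIVE_TERMS = {"nazi", "wehrmacht", "confederate soldier"}

    def allow_diversity_substitution(prompt: str) -> bool:
        """Return False when the prompt names a sensitive historical group."""
        lowered = prompt.lower()
        return not any(term in lowered for term in SENSITIVE_TERMS)

    def build_prompt(user_prompt: str) -> str:
        """Append a diversity hint only when the guard allows it."""
        if allow_diversity_substitution(user_prompt):
            return user_prompt + ", diverse group of people"
        return user_prompt  # fall back to historically accurate defaults

    for p in ["portrait of a pope", "soldiers in nazi uniforms, 1940"]:
        print(p, "->", build_prompt(p))
    ```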