When I asked Microsoft’s Copilot why the US supports Israel, I got a detailed response. When I asked why the US doesn’t support Palestine, it said it can’t talk about it. I could be wrong, but that feels biased.

  • MagicShel@programming.dev
    25 days ago

    When these things are manually censored, there is a ton of bias that is exposed. I mean the AI is going to also have bias from whatever it was trained on, but this is MS explicitly filtering out topics related to the genocide.

    This is why I’m an advocate for unfiltered AI. Whatever answer you get is biased and possibly bullshit, but you should know that regardless of what you are asking it.

    • Kintarian@lemmy.worldOP
      25 days ago

      Well that is just plain screwed up. I shouldn’t be prevented from learning more about a complex subject. There’s so much I don’t know.

      • MagicShel@programming.dev
        25 days ago

        Be very cautious about using AI as a primary source of knowledge. That said, I agree it’s very useful for exploring sensitive topics in a casual and non-judgmental way — just Google some of the things it says, during or afterward, to see who is saying them and what alternate perspectives are out there.

        There are biases inherent in the AI just based on the quantity of perspectives available to train on. Most AI has a “woke” (I mean that descriptively, not pejoratively) white American perspective, but that doesn’t mean it’s not useful, as long as you’re aware.