• The Snark Urge@lemmy.world

    It’s not that interesting. If you rephrase the question as a choice between a good option and a less good one, it’s still barely even a choice.

    “Would you rather have only one (or, say, trillions) die now, or would you like to allow *at a minimum* twice that many people to die the second we talk to a sadist?”

    If you can’t choose the smaller number, all it means is that you lack moral strength - or the test proctor has put someone you know on the tracks, which is cheating. A highly principled person might struggle when choosing between their daughter and one other person. If it’s my kid versus a billion? That’s not a choice, that’s just needless torture. Any good person would sacrifice their kid to save a billion lives. I take that as an axiom, because anything else is patently insane.

    • apollo440@lemmy.world

      Killing fewer people now is obviously the right answer, and not very interesting.

      What is interesting is that the game already breaks at junction 34, which is unexpectedly low: if junction n puts 2^(n-1) people on the track, then junction 34 holds 2^33 ≈ 8.6 billion, which is more people than are alive, so doubling is no longer possible.
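
      A minimal sketch of that arithmetic, assuming the usual reading of the meme (junction n puts 2^(n-1) people on the track) and a world population of roughly 8.1 billion (my figure, not from the thread):

      ```python
      # Find the first junction whose track holds more people than exist.
      # Junction n carries 2**(n - 1) people; the population figure is an
      # assumption for illustration.
      WORLD_POPULATION = 8_100_000_000

      junction = 1
      while 2 ** (junction - 1) <= WORLD_POPULATION:
          junction += 1

      print(junction)             # 34
      print(2 ** (junction - 1))  # 8589934592, more people than are alive
      ```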

      So a more interesting dilemma would have been “would you kill n people now, or double it and pass it on, knowing the next person faces the same dilemma, but once all of humanity is at stake and the lever is not pulled, the game ends?” Because that would involve first figuring out that the game actually only involves 34 decisions, and then the dilemma becomes “do I trust the next 33-n people not to be psychos, or do I limit the damage now?” Even more interestingly, “limiting the damage now” makes you the “psycho” in that sense…
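
      To put a hedged number on that trust question: if each later decider is independently a “psycho” who pulls with some probability p, and nobody dies if no one ever pulls, the expected body count of passing it on can be compared against killing 2^(n-1) now. Both of those modelling assumptions are mine, not part of the thread:

      ```python
      # Hedged expected-value sketch of the trust dilemma. Assumptions (mine):
      # each later decider pulls independently with probability p, the chain
      # runs out at junction 34, and nobody dies if no one ever pulls.
      def expected_deaths_if_passed(n: int, p: float, last: int = 34) -> float:
          """Expected deaths when the person at junction n passes it on."""
          total = 0.0
          nobody_pulled_yet = 1.0  # probability no later decider has pulled
          for k in range(n + 1, last + 1):
              total += nobody_pulled_yet * p * 2 ** (k - 1)
              nobody_pulled_yet *= 1 - p
          return total

      n, p = 10, 0.01
      print(2 ** (n - 1))                     # 512 deaths if you pull now
      print(expected_deaths_if_passed(n, p))  # ~1.4e8 expected if you pass
      ```

      Under these toy numbers, even a 1% psycho rate makes passing far worse in expectation, yet pulling is the act that brands you the “psycho”.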