• wia@lemmy.ca
    2 months ago

    There kinda isn’t any definitive science that pins down a specific maximum frame rate the human eye can perceive.

    There are studies, however, that report ranges from 30 to 90 Hz, and studies showing that humans can detect flicker artifacts at up to 500 Hz.

    The issue is that nothing that happens in the real world is synchronized with what you perceive. So filling in with more Hz means there’s more of a chance for you to actually perceive the thing.
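    A rough sketch of that idea (the 4 ms flash and the “each frame samples the world at a single instant” model are just assumptions for illustration):

    ```python
    import random

    def capture_probability(fps, event_duration_s, trials=100_000):
        """Estimate the chance that a brief event with a random start time,
        not synced to the frame clock, overlaps at least one frame instant."""
        frame_interval = 1.0 / fps
        hits = 0
        for _ in range(trials):
            # The event starts at a random phase within one frame interval;
            # the next frame instant lands at frame_interval.
            start = random.uniform(0.0, frame_interval)
            hits += (start + event_duration_s) >= frame_interval
        return hits / trials

    # A hypothetical 4 ms flash: the higher the rate, the more likely a frame catches it.
    for fps in (30, 60, 144, 500):
        print(fps, capture_probability(fps, event_duration_s=0.004))
    ```

    At 30 Hz a 4 ms flash gets caught only around 12% of the time; at 500 Hz it gets caught every time.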

    To complicate matters further, our brains do a lot of filling in for us, and our eyes and brains can still register things we aren’t consciously aware of perceiving. So again, more frames is always nice.

    Here are some sources

    Canadian Centre for Occupational Health and Safety. (2020). Lighting Ergonomics - Light Flicker.
    https://www.ccohs.ca/oshanswers/ergonomics/lighting_flicker.html

    Davis J, et al. (2015). Humans perceive flicker artifacts at 500 Hz.
    https://doi.org/10.1038/srep07861

    Mills M. (2020). How Many Frames per Second (FPS) the Human Eye Can See.
    https://itigic.com/how-many-frames-per-second-fps-human-eye-can-see/

    • tal@lemmy.today
      2 months ago

      If you’re using a game that renders each frame at an instant in time, and the aim is to get a better approximation of true motion blur to your eye, the theoretical maximum for getting smoother motion blur is gonna be the frame rate at which the thing is moving one pixel per frame, which is way higher than the rate at which we can distinguish between individual images. Well, okay, maybe a bit more, since you could hypothetically have sub-pixel resolution.
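      To put some made-up numbers on that (the 1920-pixel screen width and the crossing times are just example values):

      ```python
      def fps_for_one_pixel_per_frame(screen_width_px, seconds_to_cross):
          """Frame rate at which an object crossing the screen in the given time
          moves no more than one pixel between consecutive frames."""
          speed_px_per_s = screen_width_px / seconds_to_cross
          # One pixel per frame means the frame rate equals the pixel speed.
          return speed_px_per_s

      print(fps_for_one_pixel_per_frame(1920, 1.0))  # 1920.0 fps for a one-second sweep
      print(fps_for_one_pixel_per_frame(1920, 0.5))  # 3840.0 fps for a half-second sweep
      ```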

      But the point is, more rendered frames do buy you something even past the point where they’re no longer individually distinguishable, unless the game’s rendering engine can render perfectly accurate motion blur itself.
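      Here’s a minimal sketch of what those extra frames can buy: averaging several instantaneous renders spread across one display frame approximates the smear that true motion blur would produce. The 1D “dot” scene, the 480 px/s speed, and the function names are all made up for illustration:

      ```python
      import numpy as np

      def render_instant(t, width=64, speed_px_per_s=480.0):
          """Render a single bright dot at its position at time t, with no blur at all."""
          frame = np.zeros(width)
          frame[int(speed_px_per_s * t) % width] = 1.0
          return frame

      def render_with_accumulation(t0, frame_time, subsamples):
          """Approximate motion blur over one display frame by averaging several
          instantaneous renders spread across the frame interval."""
          times = t0 + np.arange(subsamples) * frame_time / subsamples
          return np.mean([render_instant(t) for t in times], axis=0)

      # One 60 Hz display frame of a dot moving 8 pixels per frame: more subsamples
      # spread the dot's energy across more of the pixels it actually passes through.
      for n in (1, 4, 16):
          smear = render_with_accumulation(t0=0.0, frame_time=1 / 60, subsamples=n)
          print(n, "subsamples ->", np.count_nonzero(smear), "pixels lit")
      ```

      With one subsample the dot just jumps 8 pixels every displayed frame; with 16 it leaves a smooth 8-pixel smear, which is much closer to what your eye would see watching the real thing.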