Had this reflection that 144 Hz screens were the only type of screen I knew of that was not a multiple of 60: 60 Hz, 120 Hz, 240 Hz, 360 Hz.

And in the middle, 144 Hz. Is there a reason why they all follow this 60 rule, and if so, why is 144 Hz here?

  • Laser@feddit.de · 11 months ago

    The reason 60Hz was so prominent has to do with the power line frequency. Screens originated as cathode ray tube (CRT) TVs that could only run at a single frequency, which was the one chosen by TV networks. They chose the power line frequency because this minimizes flicker when recording scenes lit by lights running at that same frequency, and you want to play back at the same frequency for normal content.

    This however isn't as important for modern monitors. You have image sources other than video content produced for TV, and these benefit from higher rates but don't need to match a multiple of 60. So nowadays manufacturers go as high as their panels allow; my guess is 144 exists because that's 6 × 24 Hz (the latter being the "cinematic" frequency). My monitor for example is 75 Hz, which is 1.5 × 50 Hz, 50 Hz being the European power line frequency. But the refresh rate is variable anyway, so it can match whole multiples of the content frequency dynamically if desired.
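
    A quick way to see those multiples (a minimal Python sketch; the refresh rates and base frequencies are just the ones mentioned in this thread):

    ```python
    # Express common refresh rates as multiples of the "base" frequencies
    # discussed here: 24 Hz film, 50 Hz European mains, 60 Hz North American mains.
    FILM, EU_MAINS, NA_MAINS = 24, 50, 60

    for hz in (60, 75, 120, 144, 240, 360):
        print(f"{hz:>3} Hz = {hz / FILM:g} x 24 = {hz / EU_MAINS:g} x 50 = {hz / NA_MAINS:g} x 60")
    ```

    144 comes out as a clean 6 × 24 and 75 as 1.5 × 50, matching the guess above.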

  • astraeus@programming.dev · 11 months ago

    60Hz was the original clock rate, determined by US power cycles way back in the day. This was 50Hz in some countries.

    With LCD screens, higher refresh rates became easier to achieve. Manufacturers began to advertise 120Hz TVs and monitors, which set a new bar for refresh rates. Some advertise 75Hz monitors, slightly better than 60Hz when crunching numbers; 75Hz is achieved by overclocking standard 60Hz control boards, and most can reach that refresh rate if the firmware allows it. Later HDMI standards, DisplayPort, and DVI-D support these rates at least up to 2K.

    144Hz is the same trick as 75Hz, this time with a 120Hz control board: the true standard rate is 120Hz, and it is clocked higher to reach 144Hz. Why 144 exactly? Most likely because of the lack of interface standards that originally supported higher refresh rates. Dual-link DVI-D was the only one that could push 144Hz at 1080p; any higher refresh rate (or resolution) and the signal would exceed its bandwidth. Now 144Hz is simply a new standard number, and plenty of 1440p monitors are set to this refresh rate.
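
    A rough sanity check of that bandwidth claim (a minimal Python sketch; the 2000 × 1111 timing totals are approximate reduced-blanking figures, and 165/330 MHz are the commonly cited single/dual-link DVI pixel-clock ceilings, not numbers taken from any particular monitor):

    ```python
    # Approximate pixel-clock requirements for 1920x1080 over DVI-D.
    SINGLE_LINK_MHZ = 165   # commonly cited single-link DVI pixel-clock limit
    DUAL_LINK_MHZ = 330     # commonly cited dual-link limit (2 x 165 MHz)

    # Rough reduced-blanking totals for 1920x1080: ~2000 x 1111 pixels per frame.
    H_TOTAL, V_TOTAL = 2000, 1111

    def pixel_clock_mhz(refresh_hz):
        """Approximate pixel clock needed for 1080p at the given refresh rate."""
        return H_TOTAL * V_TOTAL * refresh_hz / 1e6

    for hz in (60, 120, 144, 165):
        clk = pixel_clock_mhz(hz)
        verdict = ("fits single-link" if clk <= SINGLE_LINK_MHZ
                   else "needs dual-link" if clk <= DUAL_LINK_MHZ
                   else "exceeds dual-link")
        print(f"1080p @ {hz:>3} Hz -> ~{clk:.0f} MHz pixel clock ({verdict})")
    ```

    With these assumed timings, 144Hz needs roughly 320 MHz, just under the dual-link ceiling, while anything meaningfully higher (or a higher resolution) pushes past it.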

    • r00ty@kbin.life · 11 months ago

      Just to point out: I had 120Hz on a CRT monitor back in the late 90s/early 2000s. The resolution was terrible though (either 640x480 or 800x600). At good resolutions (1024x768 or 1280x960) you were generally stuck with 75 to 90Hz at best.

      60Hz LCD screens were one of the reasons there was resistance among gamers to move to LCD. Not to mention earlier units took a VGA input, so the picture quality was usually bad compared to CRT and there was added latency. People buying LCDs did it for the aesthetics when they first became available; where I worked, for example, only the reception desk had an LCD screen.

      Also, on a more pedantic point: 50Hz is the power line frequency in the majority of the world.

      • Meho_Nohome@sh.itjust.works · 11 months ago

        That proves that the USA is 10 better than the rest of the world.

        (Except American Samoa, Anguilla, Antigua, Aruba, Bahamas, Belize, Bermuda, Brazil, Canada, Cayman Islands, Colombia, Costa Rica, Cuba, Dominican Republic, Ecuador, El Salvador, Guam, Guatemala, Guyana, Haiti, Honduras, South Korea, Mexico, Micronesia, Montserrat Islands, Nicaragua, Okinawa, Palmyra Atoll, Panama, Peru, Philippines, Puerto Rico, St. Kitts & Nevis Islands, Saudi Arabia, Suriname, Tahiti, Taiwan, Trinidad & Tobago, United States (USA), Venezuela, Virgin Islands, and western Japan)

  • CubbyTustard@reddthat.com · 11 months ago

    OP, the top comment of this reddit thread sums it up pretty nicely:

    60 because that's the frequency of the North American power grid, which became the timing source for analog television. See this video from Technology Connections for details about how that worked. It was the standard, and standards die hard.

    120 because it's 60 doubled, the next logical step. As a bonus, it's divisible by 24, so you can watch cinema-standard 24 fps content without some frames being on screen longer than others (which was an issue on analog TV, see 3:2 pulldown).

    144 because it's bigger than 120 for marketing, and the next number divisible by 24.

    240 - 120 doubled. You can probably spot the pattern.

    There are monitors with different refresh rates, like 75, 165, 175, 200… because we don’t need strict TV or cinema compatibility anymore, but the existing standa
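
    The divisibility-by-24 point is easy to see if you work out how many refresh cycles each 24 fps film frame occupies. A minimal Python sketch (the refresh rates checked are just examples from this thread):

    ```python
    # How many refresh cycles each 24 fps film frame stays on screen at a given
    # refresh rate. Uneven counts (e.g. 3,2,3,2 at 60 Hz) are the judder behind
    # 3:2 pulldown; even counts mean every frame is displayed for the same time.
    from fractions import Fraction
    from math import ceil

    def frame_cadence(refresh_hz, content_fps=24, frames=8):
        ratio = Fraction(refresh_hz, content_fps)              # refreshes per film frame
        boundaries = [ceil(ratio * i) for i in range(frames + 1)]
        return [b - a for a, b in zip(boundaries, boundaries[1:])]

    for hz in (60, 120, 144, 160):
        print(f"{hz:>3} Hz: {frame_cadence(hz)}")
    ```

    60 Hz gives the uneven 3,2,3,2 cadence behind 3:2 pulldown, 120 and 144 Hz show every frame for the same number of cycles, and an oddball rate like 160 Hz is uneven again.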

  • JustEnoughDucks@feddit.nl · 11 months ago

    72 Hz was used as a refresh rate for CRT monitors back in the day, specifically because it was the average threshold at which users stopped reporting discomfort from CRT flicker. And 144 is 72 × 2.

    It is likely a holdover from that era. I think from there, it being a multiple of 24 Hz meant movie content scaled smoothly without tearing before vsync? The last part is a guess.

    • astraeus@programming.dev · 11 months ago

      144Hz is not a holdover in the case of computer monitors. It's the maximum refresh rate you can push through dual-link DVI-D at 1080p, which was the only standard that could support that refresh rate when they began producing LCD monitors built to run 144Hz.

    • ZephrC@lemm.ee · 11 months ago

      Old reel projectors actually flashed their light at 72Hz. They had to turn off the light while moving the film to the next frame so you couldn't see the picture moving up off the screen, and human eyes are better at spotting quickly flashing lights than at spotting microstuttery motion, so flashing the bulb once per frame at 24Hz in a dark room was headache-inducing. The solution they came up with was just to flash the bulb 3 times per frame, which is 72Hz.

  • Skull giver@popplesburger.hilciferous.nl · 11 months ago

    The 60 rule is actually based on a 59.94 rule (the NTSC field rate), and it's not really a rule, just a standard that stuck around. What's better than 60? Two times 60! What's better than two times 60? Four times 60!

    With VRR you can run certain screens sold right now at exactly 91.3 fps if you want; it's just extremely impractical.

    CRT monitors actually used to run at higher refresh rates (120Hz CRTs came way before anything close to flat panels were introduced) but the shitty limitations of the first ten years of flat panels changed the way displays were used and marketed.

  • MxM111@kbin.social · 11 months ago

    I have a 160Hz screen.

    Also, 144/24 = 6, and 24fps is the original frame rate of movies. So 160 is more puzzling from this perspective: it is not divisible by 24 or 30.

  • HubertManne@kbin.social · 11 months ago

    The numbers are a maximum, and software can set the rate lower or split it up. I worked in a visualization lab and we would often mess with the refresh rates. That being said, sometimes you could alter it and the screen would not respond (show an image), so there must be some limitations.

  • VR20X6@sh.itjust.works · 11 months ago

    ITT: A ton of people who think computer displays can only sync at a single clockrate for some reason.

    • JohnEdwa@sopuli.xyz · 11 months ago

      Fun fact: quite a few monitors can be overclocked simply by creating a custom resolution. I have a 32" ThinkVision that officially only supports 1440p 60Hz, but it's fine running at 70Hz when asked to.

  • MeanEYE@lemmy.world · 11 months ago

    With computer displays, the only limitation is the hardware. If I had to hazard a guess, 144Hz is there because that's approximately the maximum supported on the widest range of hardware, and 144Hz crystals were widely available and therefore cheap. Kind of like how there's a huge market for rollerblade ball bearings: pretty much all power tools use them. They are simply everywhere because they are cheap.