Arch and Other Linux Operating Systems Beat Windows 11 in Gaming Benchmarks::ComputerBase benchmarked three different Linux operating systems and found that all three can achieve better gaming performance than Windows 11.

  • the_q@lemmy.world · 10 months ago

    I don’t understand why new Linux folks immediately go for Arch-based distros and insist on using Nvidia GPUs. Like, are you guys into suffering or something?

    • kameecoding@lemmy.world · 10 months ago

      I used EndeavourOS (basically pure Arch with a GUI installer) and I've had zero issues with Nvidia GPUs; in fact, it was a smoother experience than anything else.

      • the_q@lemmy.world · 10 months ago

        Yeah, that's a bummer, but on Windows that's still a good card for gaming/CUDA. Nvidia, unfortunately, is a lot like Apple: they do have some neat tech, but they lock it behind both price and exclusivity. That's great for C-suite pockets, but very anti-consumer at its core.

    • yeehaw@lemmy.ca · 10 months ago

      When I started using Linux circa 2008, Nvidia was the way to go; ATI/AMD would never work as well. Fast forward, and I still use Nvidia because of CUDA cores and DaVinci Resolve for video editing. I've just been in the Nvidia card game for a long time, and I still have no problems with it.

      As for the Arch base, I started with that in 2015. I just found it more flexible, and the AUR is awesome: so much software that I could not get on a Debian-based system.

    • AnUnusualRelic@lemmy.world · 10 months ago

      I never understood what you people do with your machines. Through the years I must have used at least half a dozen Nvidia GPUs and never had any real issues.

      Of course, early on you had to compile the drivers into the kernel yourself, but then I'm not even sure ATI had drivers at the time. And that's how you configured the kernel anyway.

      • the_q@lemmy.world · 10 months ago

        I think a lot of users kinda jump in at the deep end, which is fine, but they expect their experience to be flawless. Then, when it inevitably isn't, they get upset and disheartened. I get that.

        • frezik@midwest.social · 10 months ago

          That's why I eventually settled on Ubuntu. I did Red Hat 5 in the 90s, built a Linux From Scratch system, and daily drove Gentoo for a number of years. I got sick of solving NP-complete problems in Gentoo package management. Combine that with lots of documentation saying "this is how it works in Ubuntu; everywhere else you'll have to figure it out for yourself". I don't have time for that shit.

          Hell, getting TensorFlow working with Nvidia GPUs is more straightforward on Ubuntu than it is on Windows. Nobody uses TensorFlow on pure Windows; you want to use WSL. To do that, you have to set up a passthrough layer to give WSL direct access to the GPU. There have been something like three different ways to do that over the years, and if you follow the wrong instructions from Google, you'll have to back out everything you did and make sure you start again clean, which might mean a full reinstall. On Linux, you install the Nvidia drivers, install TensorFlow with the GPU flags, and you're done.
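
          For what it's worth, here's a rough sketch of the Linux path, assuming a recent TensorFlow that bundles its CUDA libraries through the optional "and-cuda" pip extra (older releases needed a separate system-wide CUDA/cuDNN install, so treat the exact install command as version-dependent):

              # One-time setup, after installing the Nvidia driver from your distro's packages:
              #   pip install "tensorflow[and-cuda]"
              # Quick sanity check that TensorFlow actually sees the GPU.
              import tensorflow as tf

              gpus = tf.config.list_physical_devices("GPU")
              print("GPUs visible to TensorFlow:", gpus)

              if gpus:
                  # Run a tiny matmul on the first GPU to confirm the CUDA stack works end to end.
                  with tf.device("/GPU:0"):
                      x = tf.random.uniform((1024, 1024))
                      print("GPU matmul ok, checksum:", float(tf.reduce_sum(x @ x)))

          If that prints an empty GPU list, the driver or the CUDA libraries are the usual suspects, not TensorFlow itself.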

    • Samueru@lemmy.ml · 10 months ago

      I moved from a GTX 1060 to an RX 580 and it has been terrible: recording in OBS is horrible to the point that the CPU encoder yields better results, and now a recent kernel version broke the power meter on all the Polaris GPUs.

      • Goodvibes@lemmy.cafe · 10 months ago

        Damn, sorry to hear that; my experience with the RX 480 was really good. Admittedly, AMD hadn't quite caught up on hardware video encoding back when that card was designed (the 580 is basically a reskinned 480). Hardware video encoding on AMD cards has gotten drastically better since then.