Old but gold. Posting for anybody who hasn’t seen this yet.

  • chaorace@lemmy.sdf.org · 1 year ago

    I’m particularly amused by the pro-NVIDIA “it just works” comments. Compared to what exactly? With AMD, the 3D acceleration driver is bundled directly into Mesa, so it’s already working before the first boot of almost all desktop distros. That’s how drivers are supposed to work on Linux, and it has taken NVIDIA 10+ years (and counting…) to get with the basic program.

    I applaud the long-overdue decision to move their proprietary firmware directly onto the card and open-source the rest of the kernel driver, but I’ll remind you folks of a few things:

    • The open-source driver is still in alpha, with no timeline for a stable release
    • NVIDIA has so far elected to control their own driver releases instead of contributing 3D acceleration support to Mesa

    NVIDIA had to be dragged kicking and screaming to go this far, and they’re still not up to scratch. There’s still plenty of fuel left in the “Fuck NVIDIA” gas tank.

    • Fryboyter · 1 year ago

      I’m particularly amused by the pro-NVIDIA “it just works” comments. Compared to what exactly?

      Compared to nothing. I have used Nvidia graphics cards under Linux for many years. The last one was a GTX 1070. In order for the cards to work, I had to install the driver once with the command pacman -S nvidia-dkms. So the effort was very small.

      By the way, I am currently using a 6800 XT from AMD. I therefore don’t want to defend Nvidia graphics cards across the board.

      Unfortunately, when it comes to Nvidia, many people do not judge objectively. Torvalds’ “fuck you”, for example, referred to what he saw as Nvidia’s lack of cooperation with the kernel developers. And I think he was right. But it was never about how well the graphics cards worked under Linux, which, unfortunately, is what many Linux users claim it was about, whether out of ignorance or on purpose.

      Since then, some things have changed and Nvidia has contributed code to several projects like Plasma or Mesa to improve the situation regarding Wayland.

      • chaorace@lemmy.sdf.org · 1 year ago

        Compared to nothing. I have used Nvidia graphics cards under Linux for many years. The last one was a GTX 1070. In order for the cards to work, I had to install the driver once with the command pacman -S nvidia-dkms. So the effort was very small.

        Kernel modules work until they don’t. I’m genuinely glad that you’ve had a good experience and – despite appearances – I’m not interested in provoking a vendor flamewar… but the fact remains that among the three major patterns (builtin, userland, module), modules are the most fragile and least flexible. I’ll cite this response to my parent comment as an example.

        Unfortunately, when it comes to Nvidia, many people do not judge objectively. Torvalds’ “fuck you”, for example, referred to what he saw as Nvidia’s lack of cooperation with the kernel developers. And I think he was right. But it was never about how well the graphics cards worked under Linux, which, unfortunately, is what many Linux users claim it was about, whether out of ignorance or on purpose.

        That’s a fair point, but to a certain extent I think this overlooks the importance of developer sentiment on a project like Linux. Take (Intel) MacBooks as an extreme example: kernel developers liked the hardware enough to support it despite utter vendor indifference. It’s clearly a case of hypocrisy compared to NVIDIA, who (at the very least) participates, but at the end of the day people will show love for the things that they love. NVIDIA remains unloved, and I do feel that this bleeds through to the user experience a fair amount.

        In any case, you’re right to say that legitimate criticisms are often blown out of proportion. Developer problems aren’t necessarily user problems, even if we sometimes romanticize otherwise.

    • filister@lemmy.world · 1 year ago

      The problem is that Nvidia’s software stack is much more advanced. For machine learning acceleration, for example, CUDA is miles better than ROCm and far more widely supported. I wish AMD were more serious about GPUs and had made greater strides, but they overslept and let Nvidia become a de facto monopolist with their anti-competitive, anti-consumer strategies and closed-source stack.

      Nvidia is the new Apple, unfortunately.

    • j4k3@lemmy.world · 1 year ago

      I’m messing with shitvidia now on a new AAA laptop after people said it just works. I just spent all day trying to set up EFI keys for secure boot because shitvidia doesn’t sign their kernel driver modules. Plus their drivers are outdated and their documentation is terrible. I failed today because Gigabyte is another shit company with a proprietary (theft) bootloader set up so that no one can lock UEFI secure boot with any PK except theirs. I can run Fedora’s key issued by Microsoft to boot with secure boot enabled, but then I can’t use the goddamn GPU I bought the piece of shit for in the first place. Shitvidia will always be shitvidia. This proprietary bullshit is straight-up theft. It should be illegal to sell anything with digital restrictions of any kind. After dealing with all this, I think Stallman was being conservative, if anything. Fuck these criminals.

      • De Lancre@lemmy.world · 1 year ago

        Your other option would be to use an AMD iGPU, because good luck finding an AMD discrete GPU in notebooks these days. And even then, you’d just be “messing with shit-amd”.

      • De Lancre@lemmy.world · 1 year ago

        You basically have two options: suffer on Nvidia, because some features may never be developed, or suffer on AMD, because developed features just straight up don’t work.

  • De Lancre@lemmy.world · 1 year ago

    Honestly, you can downvote me for my opinion, but when we’re talking about current support from vendors, and if you just wanna play damn games, nvidia just works.

    Yes, Nvidia lacks support for some features, or takes its time implementing them (EGL support for Wayland, for example), but god damn, when we’re talking about something as simple as playing games, Nvidia is just better. You can literally stick the card you bought in, install the blob driver, and play. (On notebooks there’s a bit more hassle and a lot of stuff may not work, like sleep or automatically powering off the GPU for lower power consumption, but good luck finding a competitor nowadays, lol)

    I have a 7900 XTX, and it’s a fucking pain in the ass. There are two (three, technically) Vulkan drivers; Mesa needs to be up to date to use something like RT (and it will still suck, because they only started working on RT support about a month ago); undervolting doesn’t work and probably never will, according to a redditor who’s into amdgpu development; clock control doesn’t work; some cards can’t have their TDP controlled; there’s a problem on Wayland with VRR; and there’s a two-year-old bug [1] [2] that causes the memory clock to get stuck at maximum or minimum depending on your display’s refresh rate: imagine having a 7900 XTX and getting like 20% of its performance because the GPU doesn’t feel like playing today. Oh, and you can’t control RGB on the card yet, but that’s a small inconvenience and should be implemented soon, since it’s a missing feature in OpenRGB rather than a kernel problem. Upd: that last one is a kernel problem, as pointed out to me by a user below. Oh well.

    • CalcProgrammer1@lemmy.ml · 1 year ago

      The RGB control is a kernel problem not an OpenRGB problem (well, it might also be an OpenRGB problem if the card doesn’t work in Windows either). The amdgpu kernel driver doesn’t expose the i2c interfaces not associated with display connectors, so the i2c interface used for RGB is inaccessible and thus we can’t control RGB on Linux. AMD’s ADL on Windows exposes it just fine.

      That said, I can’t agree that NVIDIA just works. Their drivers are garbage to get installed and keep updated, especially when new kernels come out. Not to mention the terrible Wayland support and lack of Wayland VRR capability. I’m happy with my Arc A770 (whose RGB is controlled over USB and just works, but requires a motherboard header).

      • De Lancre@lemmy.world · 1 year ago

        The RGB control is a kernel problem not an OpenRGB problem

        Sorry, I rechecked it and yes, you’re right. [link] Oh well, another one for the long list of things that don’t work as they should on the amdgpu side, I guess.

      • ProtonBadger@kbin.social · 1 year ago

        Their drivers are garbage to get installed and keep updated, especially when new kernels come out

        Sure, but that’s not the case for all Linux distributions. Whenever my distribution ships a new kernel, it takes care of the Nvidia driver as part of installing the kernel, and if there’s a new Nvidia driver it installs it after a few days. I never pay much attention to it beyond noticing the output from the update.
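For what it’s worth, on Arch-based distros that “takes care of it” step is typically a DKMS rebuild triggered by a pacman hook. A minimal sketch of such a hook (the file path and exact fields here are illustrative, not the hook any particular distro actually ships):

```ini
# /etc/pacman.d/hooks/dkms-rebuild.hook (illustrative path)
[Trigger]
Operation = Install
Operation = Upgrade
Type = Package
Target = linux

[Action]
Description = Rebuilding DKMS modules for the new kernel...
When = PostTransaction
Depends = dkms
Exec = /usr/bin/dkms autoinstall
```

With nvidia-dkms installed, `dkms autoinstall` recompiles the Nvidia module against whatever kernel was just installed, which is why the update usually looks hands-off from the user's side.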

        • Vilian@lemmy.ca · 1 year ago

          Except when using newer kernels before the Nvidia driver has been updated to support them.

    • nous@programming.dev · 1 year ago

      Back when this statement was made, 11 years ago, Nvidia was a lot worse, especially for the kernel developers. A lot has changed and improved in those 11 years.

      But people still like to hang on to the old hate and don’t see, or don’t want to see, any actual progress being made. I am fairly sure that Linus even said they are not as bad as they used to be, but I cannot find that quote amongst all the results for that one angry statement he made. People and the media much prefer hating on things to actually seeing things improve.

      • MonkRome@lemmy.world · 1 year ago

        While I agree, there are other reasons to hate on them even if they have improved in one place: deceptive marketing, melting cards, poor vendor management, etc.

        • nous@programming.dev · 1 year ago

          Yeah, but their competitors are not doing much better in those regards either. The whole graphics card industry is doing shitty stuff; hell, most megacorps are these days.

    • ghariksforge@lemmy.world · 1 year ago

      This is from 10 years ago. Nvidia sucked in those days. The demand from machine learning changed all that and forced Nvidia to go open source.

      • CalcProgrammer1@lemmy.ml · 1 year ago

        NVIDIA never really went open source… They opened up their kernel drivers to a degree (by moving the majority of the interesting bits into the GPU firmware, at that), but the userspace portion (Vulkan, OpenGL, OpenCL, CUDA, etc.) is still very much closed source.

    • gens@lemmy.fmhy.ml · 1 year ago

      I got myself an expensive-ish GTX 480 because Nvidia said they would support Vulkan on it. They never implemented Vulkan for Fermi. Never again will I buy from greedy liars.

      Linus is talking about a different thing entirely. And while their drivers were always great, there is much more to the story than just how well they render 3D.

    • GreyBeard@lemmy.one · 1 year ago

      I’ve got a 7900 XT, and its idle power draw and heat generation are off the charts, so I have to aggressively sleep my computer when not in use. I’ve been hoping for an update to fix it, but nothing yet. And this isn’t really AMD’s problem, but a lot of AI stuff just isn’t possible on RDNA 3 because the Python libraries don’t support it. Some library updates have started supporting it, but the tools that make the models work often use old library versions.
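Before digging further into this kind of issue, it can help to confirm whether the ROCm stack and a ROCm-enabled PyTorch build are even present. A hedged sketch (rocminfo is the standard ROCm utility and torch.version.hip is only set on ROCm builds of PyTorch, but neither check proves a given model will actually run on RDNA 3):

```shell
# Both checks fall back to a message rather than failing,
# since either tool may simply not be installed.
rocm_status=$(command -v rocminfo >/dev/null 2>&1 \
    && echo "ROCm runtime present" || echo "ROCm runtime not found")
torch_status=$(python3 -c 'import torch; print("torch HIP:", torch.version.hip)' \
    2>/dev/null || echo "no ROCm-enabled torch build")
echo "$rocm_status"
echo "$torch_status"
```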

      • De Lancre@lemmy.world · 1 year ago

        Sorry for the late response, I only noticed your reply just now. For me, idle power draw is about 30 W on the card (60 W if the memory clock gets stuck at a high clock). It’s worse than it should be (without the memclock bug it’s about ~17 W), but doable. If you’re seeing higher power draw, something else is probably broken.

        • GreyBeard@lemmy.one · 1 year ago

          I’d have to pull out my Kill A Watt to get an accurate reading, but my house’s power draw increases by about 0.2–0.3 kW when my PC is on. That doesn’t count all my monitors and whatnot. It’s a noticeable drain on my house’s grid at idle.
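As a back-of-the-envelope check of what that reading implies (the numbers are the commenter's rough figures, not measurements):

```shell
# 0.2-0.3 kW continuous idle draw, taking the 250 W midpoint.
idle_watts=250
hours_per_day=24
# Energy per day in kWh = watts * hours / 1000
kwh_per_day=$(( idle_watts * hours_per_day / 1000 ))
echo "~${kwh_per_day} kWh/day at ${idle_watts} W idle"
```

At that rate a machine left on around the clock burns roughly 6 kWh per day, which is why aggressively sleeping it pays off.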

    • Vilian@lemmy.ca · 1 year ago

      The TDP control and the two-year-old bug are being reported more and more as fixed in the latest kernels?

  • n33rg@beehaw.org · 1 year ago

    I recall this from around the time I basically gave up dealing with Linux and Nvidia chips. At the time, I felt I couldn’t agree more. Has this improved in recent years at all? With Nvidia focusing more on data centers, I figure Linux has to be a priority, no?

    • nous@programming.dev · 1 year ago

      Things have improved a lot in the 11 years since Linus made that statement. For the most part, Nvidia drivers just work. They have even released some open-source drivers fairly recently that will go a long way toward making things even better, especially for the existing OSS drivers.

      But they are still not perfect. They keep doing their own thing for years before finally giving up and doing what everyone else does. The latest example was the EGLStreams API Nvidia created for Wayland while everyone else was using the GBM API, which forced many window managers and applications to explicitly support Nvidia’s drivers. They have now switched to GBM like everyone else (though last I heard their implementation was far from bug-free; things have likely improved since then, as that was over a year ago now).

    • redcalcium@c.calciumlabs.com · 1 year ago

      Nvidia cards are mostly working fine these days as long as you’re not using Wayland. If you are using Wayland, be prepared to encounter lots of minor annoyances, and perhaps some bugs that completely break your workflow depending on what you’re using Linux for (e.g. on a server you don’t have to deal with sleep issues, but on a desktop it’s an annoyance, while on a laptop it might be a deal breaker).
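Since many of these annoyances are Wayland-specific, it's worth confirming which session type you're actually running before blaming the driver. A small sketch (XDG_SESSION_TYPE is the conventional variable, though it's unset outside a graphical login):

```shell
# Prints "wayland", "x11", or "tty"/"unknown" depending on the session.
session="${XDG_SESSION_TYPE:-unknown}"
echo "Session type: $session"
```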

    • Zamundaaa · 1 year ago

      There are still major deal breakers, like missing synchronization between apps, Xwayland and the compositor, and there’s plenty of missing or broken features for Wayland compositors. The situation is steadily improving, but it will still be a while before you can make the assumption that there will generally be no major issues for most people.

      With Nvidia getting more into data centers as their focus, I figure Linux has to be a focus, no?

      Sure, but only in regard to data center features. Things like having a smooth desktop experience are not relevant for that, so they only get limited attention.

  • SapienSRC@lemmy.world · 1 year ago

    I recently realized, while dealing with some screen flickering with the most recent Nvidia drivers, that I had never used Linux without a Nvidia GPU. I’ve always had them in my computer so I always installed the driver. Lately I play mostly older games so I decided to remove the GPU and let my i9 sort out the graphics.

    When I say it was a NIGHT AND DAY difference in overall quality I’m not kidding. Everything was buttery smooth and any lingering thoughts of missing Windows faded away. Honestly felt like I bought a new computer.

    Now I’ve decided to sell my Nvidia GPU on eBay and either grab an AMD card or be bold and pick up an Intel Arc A750.

    So in short, to echo Linus himself, fuck Nvidia.

    • Sam@lemmy.ca (OP) · 1 year ago

      I’ve only used one AMD card with Linux, and it was so smooth I never thought about it. I’ve been using Nvidia for the past year and I’m losing my sanity with it. Switching back to AMD next week.