• sugar_in_your_tea@sh.itjust.works · 1 day ago

    I’m interested in benchmarks to compare to my current RX 6650 XT, which is pretty similar to the 4060.

    It has 12GB VRAM, which might be enough to mess around with smaller LLM models, but I really wish they’d make a high VRAM variant for enthusiasts (say, 24GB?).

    That said, with Gelsinger retiring, I’ll probably wait until the next CEO is picked to hear whether they’ll continue developing their GPUs; I’d really rather not buy into a dead-end product, even if it has FOSS drivers.

    • DarkThoughts@fedia.io · 8 hours ago

      Got the same card, and you can definitely run smaller models on 8GB. There’s no need to pay 200-300 bucks for a 4GB RAM upgrade though. Might be a nice card for people on the lower end, but not in our case. But yeah, I’d really like more VRAM too, especially with how expensive the higher-end cards get - which AMD won’t even bother with anymore anyway. Really hoping for something with 16+ GB for a decent price.
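      As a rough sketch of why 8GB already covers the smaller models (back-of-the-envelope only: the bytes-per-weight rule and the ~20% overhead factor are assumptions, and real usage also depends on context length and KV cache):

      ```python
      # Back-of-the-envelope VRAM estimate for a quantized LLM.
      # Assumption: weights dominate, at ~bits/8 bytes per parameter,
      # plus ~20% overhead for the KV cache and runtime buffers.

      def vram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
          return params_billions * (bits_per_weight / 8) * overhead

      for params in (7, 13):
          for bits in (4, 8):
              print(f"{params}B @ {bits}-bit: ~{vram_gb(params, bits):.1f} GB")
      # 7B  @ 4-bit: ~4.2 GB  -> fits on 8 GB
      # 13B @ 4-bit: ~7.8 GB  -> tight on 8 GB, fine on 12 GB
      # 13B @ 8-bit: ~15.6 GB -> where 16-24 GB would start to pay off
      ```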

      • sugar_in_your_tea@sh.itjust.works · 6 hours ago

        Yeah, I really don’t need anything higher than 6700/7700 XT performance, and my 6650 XT is still more than sufficient for the games I play. All I really need is more VRAM.

        If Intel sold that, I’d probably upgrade. But yeah, 12GB isn’t quite enough to really make it make sense; the things I can run on 12GB aren’t meaningfully different from the things I can run on 8GB.

    • circuitfarmer@lemmy.sdf.org · 1 day ago

      12GB VRAM in 2024 just seems like a misstep. Intel isn’t alone in that, but it’s really annoying they didn’t just drop at least another 4GB in there, considering how much more attractive it would have made this card.

        • circuitfarmer@lemmy.sdf.org · 1 day ago

          The industry as a whole has really dragged ass on VRAM. Obviously it keeps their margins higher, but for a card targeting anything above 1080p, 16GB should be mandatory.

          Hell, with 8GB you can run out of VRAM even at 1080p, depending on what you play (e.g. flight sims).

      • sugar_in_your_tea@sh.itjust.works · 1 day ago

        I doubt it would cost them a ton either, and it would be a great marketing tactic. In fact, they could pair it with a release of their own LLM that’s tuned to run on those cards. It wouldn’t get their foot in the commercial AI space, but it could get your average gamer interested in playing with it.

        • Da Bald Eagul@feddit.nl · 16 hours ago

          It wouldn’t cost much, but this way they can release a “pro” card with double the VRAM for 5x the price.

          • sugar_in_your_tea@sh.itjust.works · 13 hours ago

            I doubt they will. Intel has proven to be incompetent at taking advantage of opportunities. They missed:

            • mobile revolution - waited to see if the iPhone would pan out
            • GPU - completely missed the crypto mining boom and COVID supply crunch
            • AI - nothing on the market

            They need a compelling GPU, since the market is moving away from CPUs as the high-margin product in both PCs and the datacenter. If they produced an AI-capable chip at reasonable prices, they could get real-world testing before they launch something for datacenters. But no, it seems like they’re content missing this boat too, even when the price of admission is only a higher-memory SKU…

        • Chewy · 7 hours ago

          It likely depends on how much they pay for power and how many users they serve.

          E.g. I’d really like AV1 support on my server (it helps with slow upload), but the power cost of a dedicated GPU is unacceptable in my country. The few transcoding streams I’d theoretically need in a worst-case scenario are more than met by an iGPU.
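          For reference, a minimal sketch of what that iGPU transcode path could look like (the encoder name av1_qsv, the quality value, and the file names are assumptions; run `ffmpeg -encoders` to see what your build and hardware actually support):

          ```python
          # Hedged sketch: hardware AV1 transcode via FFmpeg's Quick Sync path.
          # Needs an FFmpeg build with QSV enabled and a GPU with an AV1
          # encoder (e.g. Arc); many iGPUs can only *decode* AV1 in hardware.
          import subprocess

          cmd = [
              "ffmpeg",
              "-hwaccel", "qsv",        # hardware decode
              "-i", "input.mkv",        # hypothetical input file
              "-c:v", "av1_qsv",        # hardware AV1 encode
              "-global_quality", "30",  # ICQ quality; lower = better/larger
              "-c:a", "copy",           # leave audio untouched
              "output.mkv",
          ]
          subprocess.run(cmd, check=True)
          ```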

          • sugar_in_your_tea@sh.itjust.works · 6 hours ago

            Yup. In my area, power usage is a non-issue. I pay $0.12-0.13/kWh, so my concerns around power usage are only about not being wasteful (our energy largely comes from coal and natural gas). So I wouldn’t buy the A380; even though the cost wouldn’t matter much, it’s just too wasteful.

            This new set of cards seems to be a lot more power-efficient though, so maybe they’re worth a look if you need something for transcoding.

            • Chewy · 3 hours ago

              I’ve just looked it up, and the A380 seems to draw only ~17W at idle. That’s better than I thought, but still 2-3 times what an HDD draws.
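              Putting rough numbers on that idle draw (the rates are illustrative: ~$0.125/kWh from the figure upthread, and €0.40/kWh as a stand-in for an expensive European tariff):

              ```python
              # What ~17 W of constant idle draw costs per year, at two example rates.

              def annual_cost(watts: float, rate_per_kwh: float) -> float:
                  kwh_per_year = watts / 1000 * 24 * 365  # ~149 kWh/yr at 17 W
                  return kwh_per_year * rate_per_kwh

              for label, rate in [("~$0.125/kWh", 0.125), ("~€0.40/kWh", 0.40)]:
                  print(f"A380 idle at {label}: {annual_cost(17, rate):.0f}/yr")
              # ~19/yr at the US-ish rate, ~60/yr at the high European rate
              ```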

              I wonder whether the new generation will lower idle power usage too, or only improve performance per watt.

              • sugar_in_your_tea@sh.itjust.works · 3 hours ago

                Yeah, my NAS uses something like 50W (measured at the wall) with two HDDs, a SATA SSD, a Ryzen 1700, and an old GPU (a 750 Ti; the system won’t boot without graphics). I haven’t measured everything independently, but online sources say about 6W for the GPU, so the A380 would be roughly 3x higher. That doesn’t matter too much in my area, but it’s still extra power draw.

                Hopefully the new gen is close to that 6W figure.

    • DarkThoughts@fedia.io · 9 hours ago

      If that were some OEM design for a retail PC, fine. But fuck off with shit like glued backplates on dedicated GPUs you buy.

  • commander@lemmy.world · 1 day ago

    A B770 up to a hypothetical B9XX is what I’m looking for. Phoronix benchmarks, because not many outlets do Linux benchmarks. An 8700-8800 XT or a B700-B9XX for me next year.