• state_electrician
    1 year ago

    Only slightly related question: is there such a thing as an external NVIDIA GPU for running AI models? I know I can rent cloud GPUs, but I'm wondering whether something like an external GPU might be worth it long-term.

    • baconisaveg@lemmy.ca
      1 year ago

      A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I’ve seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.

    • AnotherDirtyAnglo@lemmy.ca
      1 year ago

      Generally speaking, buying outright beats renting in the long run: you can keep running the device for years, and you can always sell it later to reclaim some of the capital.
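
      The buy-vs-rent trade-off above comes down to a break-even point. Here's a minimal sketch of that arithmetic; every number in it (used 3090 price, host rig cost, power draw, electricity rate, cloud hourly rate) is an illustrative assumption, not a figure from this thread:

      ```python
      # Break-even sketch for owning vs. renting a GPU.
      # All prices below are assumed/illustrative, not quotes.
      GPU_COST = 800.00               # assumed used RTX 3090 price (USD)
      RIG_COST = 400.00               # assumed barebones headless host (USD)
      POWER_COST_PER_HOUR = 0.35 * 0.15  # ~350 W draw at $0.15/kWh
      CLOUD_RATE = 0.50               # assumed cloud GPU rental (USD/hour)

      def break_even_hours(gpu=GPU_COST, rig=RIG_COST,
                           power=POWER_COST_PER_HOUR, cloud=CLOUD_RATE):
          """Hours of GPU use after which owning is cheaper than renting."""
          return (gpu + rig) / (cloud - power)

      hours = break_even_hours()
      print(f"Break-even after ~{hours:.0f} GPU-hours")
      ```

      Under these assumptions the crossover lands in the low thousands of GPU-hours, so heavy daily use favors buying, while occasional experimentation favors renting; resale value (ignored here) shifts the break-even further toward buying.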