• RBG · 9 months ago

    Is that feasible on a Raspberry pi?

    • Scew@lemmy.world · 9 months ago

      No, lol. Well, admittedly I’m not 100% familiar with the Pi’s newest offerings, but I have my doubts about their PCIe capabilities. Direct quote from the article:

      The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

      Makes your question seem silly when I try to imagine hooking my GPU, which is probably bigger than a Pi, up to a Pi.

      I’ve been running all the image generation models, including SD-XL (the model they “distilled” theirs from), on a 2060 Super (8GB VRAM) up to this point… Reading the article, I’m not really sure what exactly they think they’re differentiating themselves from…
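      For reference, here’s a minimal sketch of the kind of setup that already fits SD-XL into 8GB of VRAM with Hugging Face diffusers: fp16 weights plus model CPU offload. The model ID and generation settings are just typical defaults, not anything taken from the article.

      ```python
      import torch
      from diffusers import StableDiffusionXLPipeline

      # Load SD-XL base in half precision to roughly halve VRAM use.
      pipe = StableDiffusionXLPipeline.from_pretrained(
          "stabilityai/stable-diffusion-xl-base-1.0",
          torch_dtype=torch.float16,
          variant="fp16",
          use_safetensors=True,
      )

      # Keep only the active submodule (text encoder, UNet, VAE) on the GPU
      # at any one time; the rest sits in system RAM.
      pipe.enable_model_cpu_offload()

      image = pipe(
          "a photo of a raspberry pi on a workbench",
          num_inference_steps=30,
          guidance_scale=7.0,
      ).images[0]
      image.save("out.png")
      ```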

    • Wooki@lemmy.world · 9 months ago

      Lol, read the article: it cites “8GB VRAM”, and if I had to guess it will only support Nvidia out of the gate.