• poVoq@slrpnk.net · 10 months ago

    This is odd reporting: Stable Diffusion XL already runs on a GPU with 8GB of VRAM AFAIK, and it usually doesn’t need that much time to generate an image either (depends on the GPU).

    • Even_Adder@lemmy.dbzer0.com (OP) · 10 months ago

      I think they got their numbers wrong. It says they shrank it down to 700 million parameters; that would make it smaller than SD 1.5, which means it should need far less than 8GB of RAM.
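      A back-of-the-envelope sketch of why 8GB sounds high for a 700-million-parameter model (assuming fp16 weights at 2 bytes per parameter; the SD 1.5 and SDXL UNet sizes below are rough figures, and this ignores the text encoder, VAE, activations, and framework overhead):

      ```python
      def weight_vram_gb(params: int, bytes_per_param: int = 2) -> float:
          """VRAM needed for the model weights alone (fp16 = 2 bytes/param)."""
          return params * bytes_per_param / 1e9

      print(weight_vram_gb(700_000_000))    # distilled model: 1.4 GB
      print(weight_vram_gb(860_000_000))    # SD 1.5 UNet (~860M params): ~1.7 GB
      print(weight_vram_gb(2_600_000_000))  # SDXL UNet (~2.6B params): ~5.2 GB
      ```

      Even at fp32 (4 bytes per parameter), a 700M-parameter model's weights would be under 3 GB, so the 8GB figure presumably covers the full pipeline plus overhead, or a larger variant.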

      • Stampela@startrek.website · 10 months ago

        I’m guessing there’s a mix. The smallest version is 700 million parameters, possibly the one used to generate the reported timing data, but the largest (or not?) still runs in 8GB. If I remember correctly, SD3 is supposed to come in multiple versions, starting at 800 million parameters and going up, so this is going to be interesting.

  • RBG · 10 months ago

    Is that feasible on a Raspberry pi?

    • Scew@lemmy.world (edited) · 10 months ago

      No, lol. Well, I’m not 100% familiar with the Pi’s newer offerings, but idk about their PCIe capabilities. Direct quote:

      The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

      Makes your question seem silly when I try to imagine hooking up my GPU, which is probably bigger than a Pi, to a Pi.

      I’ve been running all the image generation models on a 2060 Super (8GB VRAM) up to this point, including SD-XL, the model they “distilled” theirs from… Reading the article, I’m not really sure what exactly they think they’re differentiating themselves from…

    • Wooki@lemmy.world (edited) · 10 months ago

      Lol, read the article: it cites “8GB VRAM”, and if I had to guess, it will only support Nvidia out of the gate.