Nazrin@burggit.moe to AI discussions@burggit.moe · English · 1 year ago
Beeeg GPT model (176B) (94GB) (150GB of vram needed)
huggingface.co
SmolSlime@burggit.moe · English · 1 year ago
Holy fuck, 150GB VRAM. It's interesting that an LLM requires so much more VRAM than AI art.
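For anyone wondering where those numbers come from: weight memory scales roughly as parameters × bytes per parameter, so a 176B-parameter model is a couple of orders of magnitude hungrier than a typical image-generation model. A rough back-of-the-envelope sketch below (the 176B figure is from the post title; the ~1B image-model size and the precision list are my own assumptions):

```
# Back-of-the-envelope weight-memory estimate, ignoring activations and KV cache.
# The precisions and the ~1B image-model size are assumptions, not from the post.
PARAMS_LLM = 176e9   # parameter count from the post title
PARAMS_IMG = 1e9     # roughly the size of a typical image-diffusion model

def weight_gib(params: float, bytes_per_param: float) -> float:
    """GiB needed just to hold the weights at a given precision."""
    return params * bytes_per_param / 1024**3

for name, bpp in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name:>9}: LLM ~{weight_gib(PARAMS_LLM, bpp):,.0f} GiB, "
          f"image model ~{weight_gib(PARAMS_IMG, bpp):,.1f} GiB")
```

That works out to roughly 328 GiB for the LLM in fp16 and about 82 GiB even at 4-bit, which is in the neighborhood of the 94GB download, while a ~1B-parameter image model stays in the low single-digit GiB range and fits on a consumer GPU.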