Nazrin@burggit.moe to AI discussions@burggit.moe · English · 1 year ago
Beeeg GPT model (176B) (94GB) (150GB of VRAM needed) — huggingface.co
moyi@burggit.moe · 1 year ago
We don't even have GPUs with VRAM over 100 GB, let alone affordable ones above 8 GB.
Nazrin@burggit.moe (OP) · 1 year ago
The model recommends 2 × 80 GB or 3 × 48 GB GPUs.
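A back-of-the-envelope check on why a 176B-parameter model needs on the order of 150 GB of VRAM: the weights alone take roughly `n_params × bytes_per_param`. This is a sketch, not the model's documented math — `weight_vram_gb` is a hypothetical helper, and real serving needs additional memory for activations and the KV cache on top of the weights:

```python
# Rough weight-only VRAM estimate for a dense LLM (hypothetical helper,
# not from the thread). Ignores activations and KV cache overhead.
def weight_vram_gb(n_params: float, bytes_per_param: float) -> float:
    """Return the weight footprint in GB (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

params = 176e9  # 176B parameters, as in the linked model

fp16 = weight_vram_gb(params, 2)    # 16-bit weights
int8 = weight_vram_gb(params, 1)    # 8-bit quantized
int4 = weight_vram_gb(params, 0.5)  # 4-bit quantized
print(f"fp16: {fp16:.0f} GB, int8: {int8:.0f} GB, int4: {int4:.0f} GB")
```

Under these assumptions fp16 weights alone would be ~352 GB, int8 ~176 GB, and int4 ~88 GB, which is consistent with the ~94 GB download in the post being a heavily quantized checkpoint and with 2 × 80 GB (160 GB total) clearing the stated 150 GB requirement once runtime overhead is included.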