Beeeg GPT model (176B) (94GB) (150GB of vram needed)
We don't even have GPUs with VRAM over 100 GB, let alone affordable ones above 8 GB.
It always amazes me how tech can become objectively better, and smaller, in such a short span of time.
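For a rough sense of where numbers like the 94 GB download and "150 GB of VRAM needed" come from, here's a back-of-envelope sketch in Python. The bytes-per-parameter values are the standard precisions; treating the post's figures as a quantized weight file plus runtime overhead is my assumption, not something stated in the thread.

```python
# Rough back-of-envelope: how much memory 176B parameters take
# just to store the weights, at common precisions.
# (Actual "VRAM needed" is higher, since inference also has to
# hold activations and the KV cache on top of the weights.)

PARAMS = 176e9  # 176 billion parameters

bytes_per_param = {
    "fp32": 4.0,
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

for precision, nbytes in bytes_per_param.items():
    weights_gb = PARAMS * nbytes / 1e9  # decimal gigabytes
    print(f"{precision:>9}: ~{weights_gb:,.0f} GB of weights")
```

Under these assumptions, a ~94 GB file would line up roughly with a 4-bit quantized copy of the weights (~88 GB), while full fp16 weights alone would already be ~352 GB, far beyond any single consumer GPU.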