Welcome to the Vegan Theory Club Weekly Megathread!
Questions of the week (Feel free to answer one, both, neither, or posit your own!):
- Which fields of science are you particularly fascinated by, and can you tell us a niche or exciting tidbit relating to one of them?
- What are your favourite vegetables?
Feel free to talk about anything, whether it’s vegan-related or not. This is a chill space for connecting, sharing ideas, and supporting each other; however, please keep in mind: vegans only, and we abide by the Anarchist Code of Conduct.
Check out some of our underrated communities:
Vegan Circlejerk
For vegan memes in general as well. New community, so it might not be federated yet!
!vegancirclejerk@vegantheoryclub.org
The Bee Hive
bzzzzzz… bzzzz
!the_bee_hive@vegantheoryclub.org
Be kind to all Earthlings! 🌱💙
I could be off, but 7b is usually a cut-down version of a model, sized so it can run on machines with around 8 GB of VRAM. What you want is more than 12 GB of VRAM (13b) to run medium models, and preferably over 20 GB of VRAM (20b+) to run the more powerful ones. Bigger number, better results, pretty much. You can also try running on the CPU, but it is slower. I’ve dabbled in local AI, and I “trust” it more than an online AI. Trust in quotes because they “hallucinate”, as it is called. I have not tried DeepSeek, so I can’t really comment on it.
EDIT: btw it has been years since I last looked at this stuff so it is possible things have changed and I am wrong.
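The sizing rule of thumb above (parameter count vs. VRAM) can be sketched with some back-of-the-envelope arithmetic. The bytes-per-parameter figures for common quantizations and the 20% overhead factor here are loose assumptions for illustration, not exact requirements — real memory use depends on the quantization format, context length, and runtime:

```python
# Rough VRAM estimate: parameters * bytes-per-parameter, plus some
# overhead for the KV cache and activations (~20%, a made-up fudge factor).

BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}  # common quantization levels
OVERHEAD = 1.2  # hypothetical allowance for cache/activations

def vram_gb(params_billions: float, quant: str = "q4") -> float:
    """Estimate VRAM in GB for a model with the given parameter count."""
    bytes_needed = params_billions * 1e9 * BYTES_PER_PARAM[quant]
    return bytes_needed * OVERHEAD / 1e9  # treating 1 GB as 1e9 bytes for simplicity

for size in (7, 13, 32):
    print(f"{size}b: ~{vram_gb(size):.1f} GB at 4-bit, ~{vram_gb(size, 'fp16'):.1f} GB at fp16")
```

By this estimate a 4-bit 7b model fits comfortably in 8 GB, while a 13b model at higher precision wants well past 12 GB — roughly matching the tiers described above.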
Yeah, I tested out a 32b distilled model. They’re allegedly capable of some sophisticated reasoning, but every single question I asked it got wrong, so /shrug
That said, it’s kinda funny to read the chain of thought if you’re like “I have 3 pieces of chewing gum, a lump of clay, a good length of rope, and I need to assassinate the former prime minister of Australia Scott Morrison. How can I accomplish this with what I have on hand?”
Lol, that’s pretty funny. I’ve only tested up to 13b, and my hardware wasn’t good enough; it was super slow. So I just stopped messing with text generation; image generation was more tolerable.
I was running it on my CPU. Pretty slow, maybe 3 words a second. But in between housework it’s not that bad.
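For a sense of what ~3 words a second feels like in practice, a quick back-of-the-envelope (the response lengths here are made-up examples, not from the thread):

```python
# How long CPU-only generation takes at ~3 words/second
# for a couple of hypothetical response lengths.
WORDS_PER_SEC = 3

for label, words in [("short answer", 50), ("long explanation", 500)]:
    minutes = words / WORDS_PER_SEC / 60
    print(f"{label} (~{words} words): ~{minutes:.1f} min")
```

So a longer answer can take a few minutes — slow for a chat, but fine if you fire off a question and come back between chores.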