petsoi to Linux@lemmy.ml · 1 day ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org) · 16 comments
dino · 1 day ago
I tried it briefly, but it's hot garbage if you don't have potent hardware. The number of iterations you have to do to get proper answers, and the time it takes to produce them, make it a waste of time.