petsoi to Linux@lemmy.ml · 2 days ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org) · 19 comments
dino (English) · edited 2 days ago
I tried it briefly, but it's hot garbage if you don't have potent hardware. The number of iterations you need to get proper answers, and the time it takes to produce them, make it a waste of time.