Hello

Elsewhere, I’ve been building a behaviour-shaping harness for local LLMs. While working on that, I thought, “well, why not share what the voices inside my head are saying?”

With that energy in mind, may I present Clanker Adjacent (name chosen because apparently I sound like a clanker - thanks lemmy! https://lemmy.world/post/43503268/22321124)

I’m going for a long-form, conversational tone on LLM nerd-core topics; or at least the ones that float my boat. If that’s something that interests you, cool. If not, cool.

PS: I promise the next post will be “Show me your 80085”.

PPS: Not a drive by. I lurk here and get the shit kicked out of me over on /c/technology

  • MIXEDUNIVERS
    4 days ago

That looks interesting. Any guides where this is in a Docker Compose stack with Ollama and Open WebUI? I want to experiment on an i5 6th-gen mini PC.

    noob here.

    • SuspciousCarrot78@lemmy.worldOP
      23 hours ago

      Done

I’ll give you the noob-safe walkthrough, assuming you’re starting from zero:

      1. Install Docker Desktop (or Docker Engine + Compose plugin).
      2. Clone the repo: git clone https://codeberg.org/BobbyLLM/llama-conductor.git
      3. Enter the folder and copy env template: cp docker.env.example .env (Windows: copy manually)
      4. Start core stack: docker compose up -d
      5. If you also want Open WebUI: docker compose --profile webui up -d

      Included files:

      • docker-compose.yml
      • docker.env.example
      • docker/router_config.docker.yaml

      Noob-safe note for older hardware:

      • Use smaller models first (I’ve given you the exact ones I use as examples).
      • You can point multiple roles to one model initially.
      • Add bigger/specialized models later once stable.

      Docs:

      • README has Docker Compose quickstart
      • FAQ has Docker + Docker Compose section with command examples
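
      If you want to wire up your own Ollama + Open WebUI pair alongside it, a minimal compose sketch looks something like the below. This is a generic illustration, not the docker-compose.yml that ships with the repo; the image names, ports, and environment variable are the standard ones for those two projects, but adjust to taste.

```yaml
# Minimal Ollama + Open WebUI stack (illustrative sketch only;
# not the docker-compose.yml shipped with llama-conductor).
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama      # persist downloaded models across restarts
    ports:
      - "11434:11434"             # Ollama HTTP API

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point the UI at the ollama service over the compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "8080:8080"
    depends_on:
      - ollama

volumes:
  ollama:
```

      Bring it up with docker compose up -d, then browse to http://localhost:8080. On a 6th-gen i5, stick to small quantised models at first.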
    • SuspciousCarrot78@lemmy.worldOP
      3 days ago

      Yes, if you mean llama-conductor, it works with Open WebUI, and I’ve run it with OWUI before. I don’t currently have a ready-made Docker Compose stack to share, though.

      https://github.com/BobbyLLM/llama-conductor#quickstart-first-time-recommended

      There are more fine-grained instructions in the FAQ:

      https://github.com/BobbyLLM/llama-conductor/blob/main/FAQ.md#technical-setup

PS: It will work fine on your i5. I tested it the other week on an i5-4785T with no dramas.

PPS: I will try to get some help to set up a Docker Compose stack over the weekend. I run bare metal, so it will be a bit of a learning curve. Keep an eye on the FAQ / What’s New (I will announce it there if I manage to figure it out).