• nutbutter · 1 year ago

    I self-host it in a Docker container. You will have to download about 4 gigabytes of “n-gram” data, and there are no AI features in the self-hosted version.
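
    If it helps, here is a rough Python sketch of fetching and unpacking the English n-gram archive. The archive file name and the /path/to/ngrams target directory are assumptions on my part; check https://languagetool.org/download/ngram-data/ for the current file names before running it.

    import urllib.request
    import zipfile
    from pathlib import Path

    # Assumed archive name; verify it on the LanguageTool n-gram download page.
    NGRAM_URL = "https://languagetool.org/download/ngram-data/ngrams-en-20150817.zip"
    TARGET_DIR = Path("/path/to/ngrams")  # must match the host path mounted into the container

    TARGET_DIR.mkdir(parents=True, exist_ok=True)
    archive = TARGET_DIR / "ngrams-en.zip"

    # The English archive alone is several gigabytes, so this takes a while.
    urllib.request.urlretrieve(NGRAM_URL, archive)

    # Unzipping creates an "en" subdirectory that LanguageTool picks up
    # via langtool_languageModel=/ngrams.
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(TARGET_DIR)

    archive.unlink()  # remove the zip once extracted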

      • nutbutter · edited · 1 year ago

        Here’s my podman-compose.yml file (you can use it as your docker-compose.yml as well):

        version: "3"

        services:
          languagetool:
            image: erikvl87/languagetool:latest
            container_name: languagetool
            ports:
              - 8010:8010  # Default port used by the image
            environment:
              - langtool_languageModel=/ngrams  # OPTIONAL: use the n-gram data mounted below
              - Java_Xms=512m  # OPTIONAL: minimum Java heap size of 512 MiB
              - Java_Xmx=1g  # OPTIONAL: maximum Java heap size of 1 GiB
            volumes:
              - /path/to/ngrams:/ngrams  # Host directory holding the unpacked n-gram data
            restart: always
        

        Just download the ngrams from this link, extract them, and change the host path in the file. You can then use the server with the LanguageTool Firefox or Thunderbird add-on: go to the advanced settings in the add-on preferences and enter http(s)://YOUR_IP_OR_DOMAIN:8010/v2 (drop the port if the service sits behind a reverse proxy) as the “other server” option.
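
        To make sure the server is reachable before pointing the add-on at it, here is a minimal Python check against the /v2/check endpoint (the hostname is a placeholder; port 8010 matches the compose file above):

        import json
        import urllib.parse
        import urllib.request

        BASE_URL = "http://YOUR_IP_OR_DOMAIN:8010/v2"  # placeholder host, same port as the compose file

        # POST a sentence with a deliberate grammar error to /v2/check.
        data = urllib.parse.urlencode({
            "language": "en-US",
            "text": "This are a test sentence.",
        }).encode()

        with urllib.request.urlopen(f"{BASE_URL}/check", data=data) as resp:
            result = json.load(resp)

        # Each entry in "matches" is one detected issue with suggested replacements.
        for match in result["matches"]:
            print(match["message"], "->", [r["value"] for r in match["replacements"]])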

        References: