

If all you want is efficient web quality, I'd highly recommend grabbing the cjpegli binary out of the latest static release from the libjxl repo and using the following command to slim down images with near-zero visual quality loss: cjpegli -q 90 -p 2 "input.jpg" "output.jpg". It uses modern encoding techniques derived from the next-gen JPEG XL project and stuffs the result back into a regular old JPEG container. Adjust the "90" (e.g. 90/92/95) depending on the quality level you want to target. After playing around with a few quality levels and checking the resulting file sizes, you should get a feel for what you can reasonably get away with given the resolution and makeup of a particular image. If you still can't get it small enough, you probably need to start reducing the resolution as well.
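If you want to compare a few quality levels in one go, something like this works (a rough sketch, assuming cjpegli is on your PATH and you're in a POSIX shell; the filenames are just placeholders):

    for q in 85 90 95; do
      cjpegli -q "$q" -p 2 "input.jpg" "output-q$q.jpg"   # re-encode at quality $q
    done
    ls -lh output-q*.jpg   # compare the resulting file sizes

Then just keep the smallest output that still looks acceptable to you.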
In terms of what size an average image should be for Threadiverse purposes, I'd shoot for 0.5-1MB. If it's just a meme or something whose value doesn't depend on image quality, I'd aim lower; if it's something OC like photography, I'd bump the quality higher (or maybe offer a web-quality version on click with a higher-quality version hosted elsewhere).
Better yet, why put yourself at the mercy of something that can enshittify in the first place? I've never understood why people get into self-hosting and then hand power over their network right back to a third party.