And no, I do not have the privilege of running a local model. I have heard of an AI called Maple and tried it out; it was limited enough to be a deal breaker (a 25-messages-per-week cap). I'd like to hear about other services.

  • Truscape@lemmy.blahaj.zone · 11 points · 4 months ago

    “If something is free, you are the product”

    If you're not providing your own compute or your money (and even paying doesn't make you safe with most providers), expect the payment to come from harvesting your personal information or conversation contents. I'm not sure what magical service you're expecting that doesn't involve self-hosting for truly private conversations - if you're not hosting it yourself, you're just accessing someone else's machine, and you're at their mercy.

  • floquant@lemmy.dbzer0.com · 9 points · 4 months ago

    Paid commercial LLM providers already profit off of inputs more than outputs, so I wouldn't trust any free cloud offering to be private. If you really want free and private, self-hosting is the only way to go.

  • Trent@lemmy.ml · 8 points · 4 months ago

    OpenRouter has some decently powerful free-to-use models, but I’m afraid as far as LLMs go ‘free’, ‘good’, and ‘private’ are going to be pretty mutually exclusive if you can’t run one locally.

  • iturnedintoanewt@lemmy.world · 8 points · 4 months ago

    What's "the privilege of running a local model"? If it's on a computer, it's not much of a privilege; gpt4all can be up and running in a minute. For mobile phones, RAM is more of an issue.

    • Normo!@lemdro.id (OP) · 1 point · 4 months ago

      I do not have a good enough internet connection to download local models (very unstable and slow). Downloading even an OS ISO file is not possible for me; I get them via friends. I don't think anyone would accept my request to download a 10 GB+ file.
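      For context on how big that download actually is, here's a back-of-envelope sketch of local-model file sizes at common quantization levels. The 7B parameter count and the precision labels are illustrative assumptions, not figures from this thread; real files carry some extra overhead (tokenizer, metadata) on top of the weights.

      ```python
      # Rough download size of a local model's weights at different precisions.
      # Assumes a hypothetical 7B-parameter model; ignores file-format overhead.
      def model_size_gb(n_params: float, bits_per_weight: float) -> float:
          """Approximate weight size in GB at the given bits per parameter."""
          return n_params * bits_per_weight / 8 / 1e9

      n = 7e9  # 7 billion parameters (assumed example size)
      for bits, label in [(16, "FP16"), (8, "Q8"), (4, "Q4")]:
          print(f"{label}: ~{model_size_gb(n, bits):.1f} GB")
      # FP16: ~14.0 GB, Q8: ~7.0 GB, Q4: ~3.5 GB
      ```

      So a 4-bit quantized 7B model is a few gigabytes rather than tens, which is why quantized downloads are the usual suggestion for constrained connections - though on a very unstable link even that may be impractical.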

  • Da Oeuf@slrpnk.net · 6 points · 4 months ago

    I use Jan. It’s really good. A minimal install needs around 10GB of space and you can use it offline. The ‘jan nano’ model works faster than I can read in most cases.

  • stupid_asshole69 [none/use name]@hexbear.net · 3 points · 4 months ago

    No.

    The hardware needed to host even just a full-size GPT-3 costs tens of thousands of dollars, requires high-current or high-voltage circuits not usually available in residential homes, and will draw multiple thousands of watts of power.

    If someone is giving away access to such an expensive, power-hungry industrial system for free, they're either doing it to learn from the inputs (not private) or they don't have a commercially viable system (not good).
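    The arithmetic behind that claim can be sketched roughly. The GPU VRAM and power figures below are illustrative assumptions (an 80 GB accelerator at roughly 400 W under load), and this counts only the FP16 weights, not activation memory or KV cache, so it is a lower bound.

    ```python
    # Rough numbers behind "thousands of watts" for a GPT-3-sized model.
    # All hardware figures are assumptions for illustration.
    PARAMS = 175e9           # GPT-3 parameter count
    BYTES_PER_PARAM = 2      # FP16 weights
    GPU_VRAM_GB = 80         # assumed 80 GB per accelerator card
    GPU_POWER_W = 400        # assumed board power under load

    weights_gb = PARAMS * BYTES_PER_PARAM / 1e9   # weights alone, in GB
    gpus_needed = -(-weights_gb // GPU_VRAM_GB)   # ceiling division
    total_power_w = gpus_needed * GPU_POWER_W

    print(f"weights: ~{weights_gb:.0f} GB")        # ~350 GB
    print(f"GPUs (80 GB each): {gpus_needed:.0f}") # at least 5
    print(f"power: ~{total_power_w:.0f} W")        # ~2000 W for the GPUs alone
    ```

    Even this lower bound lands at a couple of kilowatts for the GPUs by themselves, before CPUs, cooling, and the fact that real deployments need headroom beyond the bare weights.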