Of course, I’m not in favor of this “AI slop” we’re seeing this century (although I admit it has some legitimate uses, greed always speaks louder), but I wonder whether it will suffer some kind of piracy, whether it already is, or whether people simply aren’t interested in “pirated AI”.

  • OminousOrange@lemmy.ca · 1 day ago

    There are quite a few options for running your own LLM. Ollama makes it fairly easy to run one (with a big selection of models — there’s also Hugging Face, with even more models to suit various use cases), and OpenWebUI makes it easy to operate.

    Some self-hosting experience doesn’t hurt, but it’s pretty straightforward to configure if you follow along with Networkchuck in this video.
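
    For the curious, the Ollama + OpenWebUI pairing mentioned above is often run with a small Docker Compose file. This is a minimal sketch, not a definitive setup — it assumes the stock `ollama/ollama` and `ghcr.io/open-webui/open-webui` images, Ollama’s default port 11434, and a named volume for model storage; adjust ports and volumes to suit your host.

    ```yaml
    services:
      ollama:
        image: ollama/ollama
        volumes:
          - ollama-data:/root/.ollama   # persists downloaded models
        ports:
          - "11434:11434"               # Ollama's default API port

      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        environment:
          # Point the web UI at the Ollama container over the compose network
          - OLLAMA_BASE_URL=http://ollama:11434
        ports:
          - "3000:8080"                 # browse to http://localhost:3000
        depends_on:
          - ollama

    volumes:
      ollama-data:
    ```

    After `docker compose up -d`, models can be pulled from the web UI or with `docker exec -it <ollama-container> ollama pull <model>`.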

    • can@sh.itjust.works · 24 hours ago

      Any that are easier to set up on a phone? I tried something before but had trouble despite having enough RAM.

      • OminousOrange@lemmy.ca · 23 hours ago

        Not that I’m familiar with. I’d guess the limited processing power of a phone would make for a pretty poor experience, though.