Surprised pikachu face

      • utopiah · 1 month ago

        I like Ollama, and recommend it for tinkering, but I admit this “LLM Explorer” is quite neat thanks to sections like “LLMs Fit 16GB VRAM”.

        Ollama just works, but it doesn’t help you pick which model best fits your needs.
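        A rough back-of-the-envelope sketch (my own, not from any tool mentioned in the thread) of how "fits in 16GB VRAM" is usually estimated: quantized weight size plus some headroom for KV cache and activations.

```python
# Back-of-the-envelope VRAM estimate for a quantized LLM.
# Assumes weights dominate memory use; the 20% overhead factor for
# KV cache and activations is a loose rule of thumb, not a guarantee.
def est_vram_gb(params_billions: float, bits_per_weight: int,
                overhead: float = 1.2) -> float:
    bytes_per_weight = bits_per_weight / 8  # e.g. 4-bit -> 0.5 bytes
    return params_billions * bytes_per_weight * overhead

# A 13B model at 4-bit needs roughly 7.8 GB -> fits in 16 GB.
print(round(est_vram_gb(13, 4), 1))
# A 70B model at 4-bit needs roughly 42 GB -> does not fit.
print(round(est_vram_gb(70, 4), 1))
```

        Actual usage varies with context length and runtime, so treat any such estimate as a starting point, not a hard limit.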

        • Knock_Knock_Lemmy_In · 1 month ago

          > pick which model best fits your needs.

          What need do I have that justifies the effort of installing all this locally? Websites win in terms of convenience.

          • utopiah · 1 month ago

            I don’t think I understand your point. Are you saying there is no benefit to running locally, and that websites or APIs are more convenient?

            • Knock_Knock_Lemmy_In · 1 month ago

              I already have Stable Diffusion on a local machine. I was trying to find motivation to install an LLM locally. You answered my question in a different response:

              > use cases where customization helps while quality doesn’t matter much due to scale, i.e. spam, then LLMs and related tools are amazing.

          • morriscox · 1 month ago

            I want to work on my stuff in peace and in private, without worrying about a company grabbing it, using it for themselves, and giving/selling it to other outfits, including the government. “If you have nothing to hide…” is bullshit and needs to die.

            • Knock_Knock_Lemmy_In · 1 month ago

              Good point. Everything you feed into ChatGPT is stored for future reference.