• sabreW4K3 · 4 months ago

    Is there a YouTube video under 10 minutes that compares the different AI models available from DuckDuckGo?

    • Otter (OP) · 4 months ago

      A lot of it might come down to individual tasks or personal preference.

      Personally I liked Claude better than GPT-3.5 for general queries, and I have yet to explore the other two.

    • simple · 4 months ago

      Dunno, but Llama 3 is the best open source model and Claude 3 is the best overall model they offer.

    • Howdy · 4 months ago (edited)

      I use Mixtral 8x7B locally and it’s been great. I’m genuinely excited to see DDG offering it, and the service in general. Now I can use it when I’m not on my own network.

      • rutrum · 4 months ago

        What GPU are you using to run it? And what UI are you using to interface with it? (I know of GPT4All and the generic-sounding ui-text-generation program or something.)

        • Howdy · 4 months ago (edited)

          I am using this: https://github.com/oobabooga/text-generation-webui. It runs great with my AMD 7900 XT, and it also ran great with my 5700 XT. It sets itself up inside a conda virtual environment, so it takes all the mess out of getting the packages to work correctly. It can use NVIDIA cards too.

          Once you have it installed, you can grab your models from huggingface.co (rough example at the end of this comment).

          I’m on arch, btw. ;)

          Edit: I just went and reinstalled it and saw it supports these GPUs:

          https://lemmy.zip/pictrs/image/ba2ed8c4-3f15-4ed3-8204-47595403ed88.webp
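
          If you’d rather script the download than click around the web UI, here’s a minimal sketch using the huggingface_hub Python package; the repo ID and target folder are example placeholders (not something from this thread), so swap in whatever model you actually want:

          ```python
          # Minimal sketch: pull a model from huggingface.co into text-generation-webui's models folder.
          # The repo ID and local_dir below are example placeholders, not recommendations.
          from huggingface_hub import snapshot_download

          snapshot_download(
              repo_id="TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF",
              local_dir="text-generation-webui/models/Mixtral-8x7B-Instruct-v0.1-GGUF",
          )
          ```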

          • rutrum · 4 months ago

            That’s right, “text-generation-webui”. At least it’s unambiguous, lol. Thanks for sharing.

        • pflanzenregal · 4 months ago

          Open WebUI is the best self-hosted LLM chat interface IMO. It works seamlessly with Ollama, but AFAIK it also supports other OpenAI-compatible APIs (rough sketch at the end of this comment).

          I’m using the two together, and both downloading and using models is super easy. It also integrates well with the VSCode extension “Continue”, an open-source Copilot alternative (setup might require editing the extension’s config file).
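
          To give an idea of what “OpenAI-compatible” means in practice, here’s a minimal sketch using the openai Python client pointed at a local Ollama server; the base URL, placeholder API key, and model name are assumptions about a default local setup, so adjust them to match yours:

          ```python
          # Minimal sketch: talk to a local OpenAI-compatible endpoint (Ollama's default port shown).
          # Base URL, API key placeholder, and model name are assumptions; adjust for your setup.
          from openai import OpenAI

          client = OpenAI(
              base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
              api_key="ollama",                      # any non-empty string works for a local server
          )

          response = client.chat.completions.create(
              model="llama3",  # whichever model you've pulled locally
              messages=[{"role": "user", "content": "Summarize what Open WebUI does in one sentence."}],
          )
          print(response.choices[0].message.content)
          ```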