Hello internet users. I've tried GPT4All and like it, but it's very slow on my laptop. I was wondering if anyone here knows of a solution I could run on my server (Debian 12, AMD CPU, Intel A380 GPU) through a web interface. Has anyone found a good way to do this?

  • johntash · 8 months ago

    Ollama and LocalAI can both run on a server with no GPU. You'd need to point a separate web UI at them if you want one, though.
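
    A rough sketch of what that setup could look like, assuming you go the Ollama route with Open WebUI as the frontend (model name and ports are just examples; check the current docs for both projects before running anything):

    ```shell
    # Install Ollama via its official install script
    curl -fsSL https://ollama.com/install.sh | sh

    # Bind to all interfaces so other machines (and containers) can reach the API,
    # which listens on port 11434 by default
    OLLAMA_HOST=0.0.0.0 ollama serve &

    # Pull a model to chat with (pick something small for CPU-only inference)
    ollama pull llama3.2

    # Run Open WebUI in Docker and point it at the Ollama API on the host;
    # the web interface is then reachable at http://<server-ip>:3000
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
      -v open-webui:/app/backend/data \
      --name open-webui ghcr.io/open-webui/open-webui:main
    ```

    On CPU-only hardware, smaller quantized models will be far more usable than large ones; the Intel A380 isn't well supported by most of these stacks, so expect inference to run on the CPU.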