Local LLM Helper
Interact with your local LLM server directly from your browser.
Chat with your local LLM right from your browser. Works with Ollama, LM Studio, and any OpenAI-compatible server.
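Servers like Ollama and LM Studio expose an OpenAI-compatible chat endpoint, which is what makes a single extension work across all of them. A minimal sketch of how such a request could be built from the browser (the base URL and model name are assumptions, e.g. Ollama's default port; this is not the extension's actual code):

```javascript
// Build a request for an OpenAI-compatible /v1/chat/completions endpoint.
// baseUrl and model are assumptions: Ollama typically serves on
// http://localhost:11434, LM Studio on http://localhost:1234.
function buildChatRequest(baseUrl, model, messages) {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage sketch: send it with fetch and read the first choice.
// const { url, options } = buildChatRequest(
//   "http://localhost:11434",
//   "llama3",
//   [{ role: "user", content: "Hello" }]
// );
// fetch(url, options)
//   .then((r) => r.json())
//   .then((d) => console.log(d.choices[0].message.content));
```

Because the request shape is the same across servers, only the base URL needs to change to point the extension at a different backend.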