Quick access to your favorite local LLM from your browser (Ollama).
Ollama UI is a small open-source extension for Chromium-based browsers (Chrome, Brave, Edge, etc.) that gives you quick access to your favorite local LLM assistant while browsing.
Some features of this version:
- Compatible with any model available through Ollama (Llama 3, Phi-3, Mistral, Gemma, ...)
- Selector for on-the-fly switching between installed models
- Direct access via the sidebar (in supported browsers) and through a new tab
- Markdown format support
- Streamed, token-by-token text rendering (see the sketch after this list)
- Simple and lightweight design
- Theme support: Light, Dark, NOSTROMO COMPUTER MU-TH-UR 6000, and Retro Terminal (MS-DOS "Perfect DOS VGA")
- Pre-prompts and scenarios to quickly put your LLM in the right mood :)
- Customization options: font size, local user and LLM name, header text
- Open source
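As a rough illustration of how the streamed rendering can work, the sketch below talks to Ollama's local HTTP API (the /api/chat endpoint on the default port 11434, which returns one JSON object per line while streaming). The function name streamChat, the onToken callback, and the "llama3:8b" default are illustrative assumptions, not the extension's actual code.

```ts
// Sketch of streamed chat against Ollama's local HTTP API (default port 11434).
// Assumed names: streamChat, onToken; the endpoint and payload follow Ollama's documented API.
async function streamChat(
  prompt: string,
  onToken: (token: string) => void,
  model = "llama3:8b",
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: true,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: HTTP ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is a JSON chunk like {"message":{"content":"..."},"done":false}.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onToken(chunk.message.content); // render as tokens arrive
      if (chunk.done) return;
    }
  }
}

// Usage: accumulate the streamed answer and print it once the model is done.
let answer = "";
streamChat("Why is the sky blue?", (t) => (answer += t))
  .then(() => console.log(answer))
  .catch(console.error);
```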
Simply click on the extension icon and start chatting with your virtual assistant. Right-click on the extension icon to open a new tab.
Make sure Ollama is installed and running (the snippet after these steps shows one rough way to check programmatically):
Download Ollama: https://ollama.com/
Install any of the models available for Ollama. For example, to get Llama 3 from Meta, run "ollama run llama3:8b" in your OS terminal.
Llama 3 installation video tutorial: https://www.youtube.com/watch?v=7ujZ1N4Pmz8
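To confirm that the local Ollama server is reachable before opening the extension, you can query its /api/tags endpoint, which lists the installed models (the same information a model selector can use). The endpoint and default port are Ollama's documented defaults; the function name and messages below are illustrative only.

```ts
// Sketch: check that the local Ollama server is up and list the installed models.
// Assumed names: TagsResponse, listInstalledModels; /api/tags and port 11434 are Ollama's defaults.
interface TagsResponse {
  models: { name: string }[]; // other fields (size, digest, ...) omitted here
}

async function listInstalledModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Ollama answered with HTTP ${res.status}`);
  const data = (await res.json()) as TagsResponse;
  return data.models.map((m) => m.name); // e.g. ["llama3:8b", "phi3:latest"]
}

listInstalledModels()
  .then((names) => console.log("Installed models:", names))
  .catch(() => console.error("Ollama does not seem to be running on localhost:11434"));
```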