
Ollama Client - Chat with Local LLM Models


CRX ID: bfaoaaogfcgomkjfbmfepbiijmciinjl
Status: Live on Store
Description from extension meta

Privacy-first Ollama Chrome extension to chat with local AI models like LLaMA, Mistral, Gemma – fully offline.

Description from store

🧠 Ollama Client – Chat with Local LLMs Inside Your Browser

Ollama Client is a lightweight, privacy-first Ollama Chrome extension that brings the power of local AI models and offline AI chat directly to your browser. No cloud dependencies. No API keys. No data sent externally.

Just fast, secure offline AI chat in an Ollama browser extension, powered by open-source models like LLaMA 3, GPT-OSS, Mistral, Gemma, and CodeLLaMA – all running on your own machine via the Ollama backend.

✨ Works on all Chromium-based browsers (Chrome, Edge, Brave) and Firefox (with additional setup). 100% open-source.

🚀 Key Features
🔌 Local Ollama Integration – Connect to a local Ollama server (no API keys)
💬 In-Browser Chat UI – Lightweight, minimal, fast (Ollama-ui alternative)
🛡️ 100% Local and Private – All storage and inference happen on your device (frontend interface for Ollama)
⚙️ Custom Settings – Control model parameters, themes, prompt templates
🔄 Model Switcher – Switch between models in real time
🔍 Model Search & Pull – Pull models directly in the UI (with progress indicator)
🗑️ Model Deletion with Confirmation – Clean up unused models from the UI
🧳 Load/Unload Models – Manage Ollama memory footprint efficiently
🎛️ Tune Parameters – Temperature, top_k, top_p, repeat penalty, stop sequences (see the request sketch below)
🧠 Transcript & Page Summarization – Works with YouTube, Udemy, Coursera & web articles
🔊 TTS – Built-in Text-to-Speech via Web Speech API
🗂️ Multi-Chat Sessions – Save/load/delete local chats
📤 Export Chat Sessions – Export single or all chat sessions as PDF or JSON
📥 Import Chat Sessions – Import single or multiple chat sessions from JSON files
🧯 Declarative Net Request (DNR) – Automatic CORS handling (v0.1.3)
📋 Copy & Regenerate – Quickly rerun or copy AI responses
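
Under the hood, a frontend like this talks to the local Ollama server's documented REST API, and the tunable parameters above correspond to the `options` object of a chat request. A minimal sketch follows; the endpoint and fields are standard Ollama API, but the exact payload the extension sends is an assumption:

```
# Sketch: a chat request against a local Ollama server (default port 11434).
# The "options" fields mirror the tunable parameters listed above.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3:8b",
  "messages": [{"role": "user", "content": "Explain DNR in one sentence."}],
  "stream": false,
  "options": {
    "temperature": 0.7,
    "top_k": 40,
    "top_p": 0.9,
    "repeat_penalty": 1.1,
    "stop": ["User:"]
  }
}'
```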

🧭 Tab Access (Optional)

Want your LLM to understand the content of a page you're viewing? Enable Tab Access in the settings to fetch page content or transcripts for better contextual answers.

βœ”οΈ Fully opt-in
βœ”οΈ You choose which tabs to share
βœ”οΈ Customizable exclude list (regex supported)
βœ”οΈ No tab data ever leaves your device

βš™οΈ Installation & Setup

1️⃣ Install Ollama Client from the Chrome Web Store
2️⃣ Install Ollama on your machine from https://ollama.com and run `ollama serve`
3️⃣ Pull your favorite models (e.g., `ollama pull llama3:8b`, `gemma:2b`) and start chatting!
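
For reference, the backend setup in steps 2 and 3 amounts to a few terminal commands (Linux install script shown; macOS and Windows installers are downloads from https://ollama.com):

```
# Step 2: install and start the Ollama server.
curl -fsSL https://ollama.com/install.sh | sh
ollama serve   # listens on http://localhost:11434 by default

# Step 3: pull models, then chat from the extension.
ollama pull llama3:8b
ollama pull gemma:2b
```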

Advanced users can customize themes, model parameters, prompt templates, and excluded URLs from the Options page.

🎯 Who Should Use Ollama Client?

👩‍💻 Developers building with or debugging LLMs
📚 Researchers who want local, private LLM interfaces
🎓 Students using AI as study aids on local hardware
🔐 Privacy advocates avoiding cloud AI and APIs
🤖 AI tinkerers and open-source model enthusiasts

⚡ Performance & Hardware Recommendations

💻 8 GB RAM (no GPU): gemma:2b, mistral:7b-q4
💻 16 GB RAM (no GPU): gemma:3b-q4, gemma:2b
🚀 16 GB+ with GPU (6GB VRAM): llama3:8b-q4, gemma:3b
💥 32 GB+ or high-end GPU: llama3:8b, codellama:13b
🔥 RTX 3090+, Apple M3 Max: llama3:70b, mixtral
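
To sanity-check a tier before committing to a large model, the Ollama CLI can report what is installed and what is currently consuming memory (a quick sketch; `ollama ps` requires a reasonably recent Ollama release):

```
ollama list                 # models on disk, with size
ollama ps                   # loaded models and their RAM/VRAM usage
ollama run gemma:2b "Hi"    # smoke-test a small model first
```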

Note: the Ollama Client Chrome extension is a frontend interface only. All LLM generation happens through your local Ollama install, so speed and output quality depend on your hardware.

🔗 Useful Links

🌐 Chrome Web Store: https://chromewebstore.google.com/detail/ollama-client/bfaoaaogfcgomkjfbmfepbiijmciinjl
📘 Setup Guide: https://ollama-client.shishirchaurasiya.in/ollama-setup-guide
💻 Landing Page: https://ollama-client.shishirchaurasiya.in/
🔒 Privacy Policy: https://ollama-client.shishirchaurasiya.in/privacy-policy
🧑‍💻 GitHub: https://github.com/Shishir435/ollama-client
🐞 Bug Reports: https://github.com/Shishir435/ollama-client/issues

🚀 Start chatting in seconds – private, fast, and fully local AI conversations on your own machine.

Built for developers, researchers, and anyone who values speed, privacy, and offline AI control.
#ollama #privacy #ollama-client #opensource #offline #ollama-ui #ollamachat #gpt-oss

Latest reviews

Евгений Архипов
Context-1 Title: Untitled Content: ❌ Error: Could not establish connection. Receiving end does not exist.
Sunny
Amazing tool! Seamlessly brings local LLMs to the browser with privacy and speed.
Ravish Kumar
Really good, now I won't need to go to the ChatGPT tab all the time.
Lorent Felix
This extension does the job requested and the developer is very responsive. I recommend it; it is worth trying.
Victor Sarkar
This is the thing I have been looking for 🔥🔥🔥