
Ollama Sidekick

CRX ID: ggeleclijedgoacjldhaefjcaofgkeli
Description from extension meta

Chat with your local Ollama AI models in a side panel. Includes webpage content as context. All data stays on your device.

Description from store

Ollama Sidekick is a browser side panel interface for chatting with your locally hosted Ollama AI models. All conversations happen entirely on your machine; no data is sent to external servers.

WHAT THIS EXTENSION DOES

This extension provides a chat interface that connects to Ollama running on your local computer. You can ask questions about any webpage you're viewing, and the extension will include the page content as context for the AI.

Key capabilities:
- Chat with local Ollama models in a side panel
- Automatically extract content from the current webpage as context
- Highlight text on any page to include it in your question
- Manage multiple chat conversations
- Switch between any Ollama models you have installed

REQUIREMENTS

This extension requires Ollama to be installed and running locally:

1. Install Ollama from ollama.com
2. Download a model: ollama pull llama3.2
3. Start Ollama with CORS enabled:
- macOS/Linux: OLLAMA_ORIGINS='*' ollama serve
- Windows (Command Prompt): set "OLLAMA_ORIGINS=*" && ollama serve (the quotes keep a stray trailing space out of the value)
- Windows (PowerShell): $env:OLLAMA_ORIGINS='*'; ollama serve
4. Click the extension icon to open the side panel
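
Before opening the side panel, you can confirm the server is reachable by sending a plain GET to Ollama's root endpoint, which replies with a short status line:

  curl http://localhost:11434
  # Expected reply: Ollama is running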

The extension connects only to localhost:11434 (the default Ollama port). No external servers are contacted.
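
Under the hood, each chat message is a plain HTTP call to the local Ollama API. Whether the extension uses the documented /api/chat or /api/generate endpoint internally, all of the traffic stays on localhost; a minimal sketch of this kind of request looks like:

  curl http://localhost:11434/api/chat -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": false
  }'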

HOW PAGE CONTEXT WORKS

When you visit a webpage, the extension can read the page content and send it to your local Ollama server along with your question. This lets you ask things like:
- "Summarize this article"
- "Explain this code"
- "What are the main points?"

You can enable or disable page context with a toggle in the extension.
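
The exact prompt format is internal to the extension, but conceptually the extracted page text rides along as extra context in the same kind of /api/chat request. In this sketch, <extracted page text> is a placeholder for whatever the extension pulls from the page:

  curl http://localhost:11434/api/chat -d '{
    "model": "llama3.2",
    "messages": [
      {"role": "system", "content": "Context from the current page: <extracted page text>"},
      {"role": "user", "content": "Summarize this article"}
    ],
    "stream": false
  }'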

PRIVACY

- All processing happens locally on your device
- No account or sign-up required
- No data collection or analytics
- No external API calls
- Chat history is stored only in your browser

SUPPORTED MODELS

Works with any model available in Ollama, including Llama, Mistral, Gemma, CodeLlama, and others. The extension detects which models you have installed and lets you switch between them.
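
Model detection corresponds to Ollama's documented /api/tags endpoint; you can inspect the same list of installed models yourself:

  curl http://localhost:11434/api/tags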

TROUBLESHOOTING

If you see a connection error:

1. Make sure Ollama is installed (download from ollama.com)

2. Stop any existing Ollama process:
- macOS/Linux: pkill ollama
- Windows: Close Ollama from the system tray or use Task Manager

3. Start Ollama with CORS enabled:
- macOS/Linux: OLLAMA_ORIGINS='*' ollama serve
- Windows (Command Prompt): set "OLLAMA_ORIGINS=*" && ollama serve
- Windows (PowerShell): $env:OLLAMA_ORIGINS='*'; ollama serve

4. Click the retry button in the extension
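
To check CORS specifically, send a request with an Origin header; if OLLAMA_ORIGINS took effect, the response headers should include Access-Control-Allow-Origin:

  curl -i -H "Origin: chrome-extension://ggeleclijedgoacjldhaefjcaofgkeli" http://localhost:11434/api/tags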

Windows users: If you installed Ollama as a desktop app, you may need to set the OLLAMA_ORIGINS environment variable in the Environment Variables dialog (search the Start menu for "environment variables"), then quit and restart the Ollama app.
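
Alternatively, the same variable can be set persistently from a Command Prompt with the built-in setx command; new processes pick it up after you restart the Ollama app:

  setx OLLAMA_ORIGINS "*"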