
SecureAI Chat: Chat with local AI (LLaMA 3, Mistral, and more — no cloud)

CRX id

nabflaimaoncpnhiecjnaddgkmejnffm

Description from extension meta

Private AI assistant that works with local LLMs (like LLaMA 3) using Ollama. Summarize pages, ask anything — all without cloud APIs.

Description from store

Chat with AI locally – no cloud required. This Chrome extension lets you interact with powerful open-source LLMs like LLaMA 3, running locally via Ollama, directly from any browser tab.

🔒 100% Private: Your data stays on your machine: no tracking, no cloud APIs, no data sent to external servers.

💬 Key Features:

  • Ask questions and get intelligent answers instantly
  • Summarise selected web content with one click
  • Launch a floating chat window on any page
  • Choose between multiple installed local models (e.g., LLaMA, Mistral, etc.)
  • Clean, lightweight UI with fast responses

🛠️ Powered by Ollama – a simple, secure way to run local LLMs on your system.

Whether you're researching, coding, reading, or brainstorming, this extension brings the full power of local AI into your browser.

Disclaimer: This extension is not affiliated with or endorsed by Meta, Ollama, or Google. “LLaMA”, “Chrome”, and “Ollama” are trademarks of their respective owners.
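The store copy describes the extension's core mechanism: the browser talks to a locally running Ollama instance instead of a cloud API. As a rough, hypothetical sketch of that kind of call (assuming Ollama's default local endpoint http://localhost:11434, its /api/generate route, and an installed llama3 model; this is not the extension's actual source), a summarize helper might look like:

// Hypothetical illustration of a browser-to-Ollama call; not the extension's real code.
// Assumes Ollama is running locally and `ollama pull llama3` has already been run.
interface OllamaGenerateResponse {
  model: string;
  response: string;
  done: boolean;
}

async function summarizeLocally(selectedText: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",   // any locally installed model (e.g., mistral) would work here
      prompt: `Summarize the following text:\n\n${selectedText}`,
      stream: false,     // request a single JSON response instead of a token stream
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }
  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response;
}

// Example: summarize the current selection from a content script.
// summarizeLocally(window.getSelection()?.toString() ?? "").then(console.log);

In a Chrome extension, a call like this would also typically need host permission for http://localhost:11434/* declared in the manifest.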

Latest reviews

  • (2025-07-25) Suhail Khan: Nice extension. I tried running the gemma3-1b model on my laptop, and it worked well. There might be some improvement required on the summarize feature. Overall, it's good.

Statistics

Installs: 12
Category:
Rating: 5.0 (2 votes)
Last update / version: 2025-07-24 / 1.0.0
Listing languages: en

Links