
open-os LLM Browser Extension

CRX id

kgeinnbgpilffgaipgihigcphcokellk

Description from extension meta

Quick access to your favorite local LLM from your browser (Ollama).

Image from store: open-os LLM Browser Extension
Description from store

Small open-source extension for Chromium-based browsers like Chrome, Brave, or Edge to quickly access your favorite local AI LLM assistant while browsing.

Some features of this first alpha version:
- Compatible with any LLM model included in Ollama (Llama3, Phi3, Mistral, Gemma...)
- Selector for on-the-fly switching between installed models
- Direct access via the sidebar (in supported browsers) and through a new tab
- Markdown format support
- Text rendering by tokens (stream)
- Simple and lightweight design
- Open source

Simply click on the extension icon and start chatting with your virtual assistant. Right-click on the extension icon to open a new tab.

Make sure Ollama is installed and running. Download Ollama: https://ollama.com/ Then install any of the models available in Ollama; for example, for Llama3 from Meta, type "ollama run llama3:8b" in your OS terminal. Llama3 installation video tutorial: https://www.youtube.com/watch?v=7ujZ1N4Pmz8
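The listing mentions token-by-token (stream) rendering against a local Ollama server. As a rough illustration of how such an extension might do this (not the extension's actual source), here is a minimal TypeScript sketch: the endpoint http://localhost:11434/api/generate, the newline-delimited JSON streaming format, and the "stream" flag come from Ollama's documented REST API, and the model tag "llama3:8b" matches the example above; the function name and token callback are hypothetical.

```typescript
// Minimal sketch: stream a completion from a local Ollama server and
// hand each token fragment to a callback (e.g. to append to a chat view).
async function streamFromOllama(
  prompt: string,
  onToken: (token: string) => void,
  model = "llama3:8b",  // assumed default; any installed Ollama model works
): Promise<void> {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: true }),
  });
  if (!response.ok || !response.body) {
    throw new Error(`Ollama request failed: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";

  // Ollama replies with newline-delimited JSON objects; each carries a
  // "response" text fragment and a "done" flag on the final chunk.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line) as { response?: string; done?: boolean };
      if (chunk.response) onToken(chunk.response);
      if (chunk.done) return;
    }
  }
}

// Example usage:
// streamFromOllama("Why is the sky blue?", (t) => console.log(t));
```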

Statistics

Installs
19
Category
Rating
0.0 (0 votes)
Last update / version
2024-05-01 / 0.5
Listing languages
en

Links