
Page Assist - A Web UI for Local AI Models


CRX ID
jfgfiigpkhlkbnfnbobbkinehhfdhndo
Status
  • Extension status: Featured
  • Live on Store
Description from extension meta

Use your locally running AI models to assist you in your web browsing.

Image from store: Page Assist - A Web UI for Local AI Models
Description from store

Page Assist - A Sidebar and Web UI for Your Local AI Models

Use your own locally running AI models to interact with pages as you browse, or as a web UI for your local AI model provider, such as Ollama or Chrome AI.

Repo: https://github.com/n4ze3m/page-assist

Current Features:

- Sidebar for various tasks
- Support for vision models
- A minimal web UI for local AI models
- Internet search
- Chat with PDF on Sidebar
- Chat with Documents (PDF, CSV, TXT, MD, DOCX)
- Tab Mention (beta)

Supported Providers:

- Ollama
- [Beta] Chrome AI (Gemini Nano)
- OpenAI-compatible API support (llama.cpp, LM Studio, Llamafile, vLLM, and many more providers)
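A common setup hurdle with the Ollama provider (reflected in a CORS error one reviewer pasted below) is that Ollama, by default, rejects requests from browser-extension origins. A minimal sketch of one documented workaround, assuming a default local Ollama install, is to set the `OLLAMA_ORIGINS` environment variable before starting the server:

```shell
# Allow browser extensions (including Page Assist) to call the local
# Ollama API by whitelisting the chrome-extension:// origin scheme.
# OLLAMA_ORIGINS is a documented Ollama server environment variable.
export OLLAMA_ORIGINS="chrome-extension://*"
ollama serve
```

This is a sketch, not the extension's official setup guide; consult the Page Assist repository and the Ollama FAQ for the recommended configuration on your platform.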

Latest reviews

Daniel Jaquet
Amazing!!
no no
Works incredibly well
Henrique Matos
Surprisingly good!
Meir Michanie
a must have
Yc F
nice tool
Anurag Vohra
Wonderful tool. Thank you.
Anish Neupane
Just awesome!
Günther Bosch
The configuration needs some time and understanding of what you are doing. AI newbies might have a hard time. The extension works very well and the developer is very quick in responding questions and fixes. Also, the extension code is shared on GitHub and visible to everybody. This is a great plus when you are concerned about privacy.
Ilia Beliakov
Amazing extension. It simply contains everything you need to work with Ollama models on board.
Bryan M.
Amazing stuff. I wish I could give Page Assist access to download LLMs to external USB drives.
Kevin Oh
Awesome front-end for ollama. Would be great if we could pass cli arguments to ollama, since some models (Qwen3) seem to have "thinking" on by default. The slider in the UI for "Thinking" shows "false" but it's on by default and must be explicitly turned off.
Hristo Kolev
Works awesome, but why no option to summarize YouTube videos?! That's essential!
Jason
Very impressed. I've been using Open WebUI, which is a lot of trouble to install and set up, only to get super slow LLM web search. Then I found Page Assist and wow, it's so easy to set up, and the web search is fast. I only hope the "brain" icon can be changed to something better looking and that Page Assist can be saved as a web app.
Donna Colburn
Please add a login/logout feature. I mean, the ability to automatically back up chat history or prompts via email/Gmail authentication, so it can be accessed universally across all browsers at the same time.
Naz Home
One of the simpler web UIs for Local AI
Skyro
One problem: web search is very slow. Please add a DuckDuckGo API option for the AI's search, which is free for unlimited requests and fast too. Only this is needed; otherwise it's the best lightweight offline AI extension.
Сергій
Good!
C M
Couldn't be any easier. Just wish I could specify which drive to store the models on.
Time for GameDev
Top baby
Daniel Vagg
I've tried out a bunch of extensions for using local LLMs or custom endpoints, and this one is by far the best I've come across so far. It's super easy to set up, letting me connect my locally hosted models or custom endpoints without any major headaches. The interface is intuitive, and there are tons of customisation options that make it really flexible for both personal and professional projects. I haven't even started diving into all the custom prompts and things yet, but from what I've seen so far, this extension feels great! Hats off to the dev or team that built this. Star it on GitHub if you have an account!
Bodong Chen
An amazing extension. One small issue -- the sidebar mode cannot be opened in Arc Browser. Apparently it's not the extension's problem but I wonder whether this extension can be updated to work with this situation.
Lunique
Super nice extension
Marcos A. Sobrinho
Pretty nice!! Thanks.
Yiannis Sofologis
this is what I was looking for, perfect for chatting privately with pages using my local qwen2.5:1.5b
Stu Rodgers
Great extension! Love the simplistic style and the functions it provides.
Luyao Xu
Very good extension tool for Chrome, especially the web search function!
M KB
Thank you for this
Dawid Ryba
Great work!
Su Mark
The system prompt appears to be malfunctioning.
HZ97 #凛fam
Nice!
于朝阳
The plugin is very easy to use, and it also has a knowledge base feature. I am a little curious how the files uploaded to the knowledge base are processed: are they sent directly to the large model for training? I'd appreciate an answer from someone who knows.
George Wells (padeoe)
Great work! Thanks!
ma xin
So nice, it really works! I hope the UI becomes even more beautiful!
Qingjun Li
Good to use
Yuhe Ji
Great expansion. However, I noticed that when opening the sidebar with the shortcut key, there is a loading time of a few seconds before the model selection box appears. This issue does not occur when opening a separate page or when the 'restore last chat' option is checked while opening the sidebar.
Vic Hav
Good extension. Be aware: if you use a local LLM like Ollama and it happens that Ollama is not running, the browser will freeze at start indefinitely...
Mojtaba Ranjbar
simple and cool
翼影
Will an AI tavern feature be developed in the future? If this feature is released, I think more players who like to play with AI will pay attention to it and use it. If you can put character cards on it, it will save time.
Lssodi Nowei
I like!!!
yl test
Access to fetch at 'http://127.0.0.1:11434/api/tags' from origin 'chrome-extension://jfgfiigpkhlkbnfnbobbkinehhfdhndo' has been blocked by CORS policy: The 'Access-Control-Allow-Origin' header has a value 'http://127.0.0.1' that is not equal to the supplied origin. Have the server send the header with a valid value, or, if an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
莊承澤
This extension program is very useful. However, I find that the message viewing area is too narrow (my screen has a 4K resolution with a 32” LCD). Could you please add a drop-down menu that allows us to adjust the width of the window? Additionally, could you add a button for clearing the chat conversation history? Thank you for your work.
M Smart
Tried with ollama Deepseek R1:32b. Works perfectly.
Gerg
Works very well with DeepSeek R1. I'm surprised my PC even runs it without crashing.
B
Wow, very easy to use
Dior Wang (Dyeus)
Very Nice Extension. It saves my storage and time for installing local WebUI like Open WebUI and Docker
Farid Parma
Best extension for Chrome
Shishira Bhat
Best extension in the world
KLaus Z
Great work, so useful when deploying a local AI model
Kevin Lee
Why can't I pull models? I used Cloud Studio to run Ollama with deepseek-r1:8b and llama3:latest. The URL added to Page Assist shows Ollama is running, but without models! I do have deepseek-r1:8b and llama3:latest; how is that possible? Could anyone solve this problem? Thanks!
joe pressnell
You are a legend, and you will be the next tech on my contact list of heroes.