ExtPose extension listing

Page Assist - A Web UI for Local AI Models

CRX id

jfgfiigpkhlkbnfnbobbkinehhfdhndo

Description from extension meta

Use your locally running AI models to assist you while browsing the web.

Image from store: Page Assist - A Web UI for Local AI Models
Description from store: Page Assist - A sidebar and web UI for your local AI models. Use your own AI models running locally to interact with pages while browsing, or as a web UI for your local AI model provider such as Ollama, Chrome AI, and others. Repository: https://github.com/n4ze3m/page-assist

Current features:
- Sidebar for various tasks
- Support for vision models
- A minimal web UI for local AI models
- Internet search
- Chat with PDFs in the sidebar
- Chat with documents (pdf, csv, txt, md, docx)

Supported providers:
- Ollama
- [Beta] Chrome AI (Gemini Nano)
- [Beta] OpenAI-compatible API support (LM Studio, Llamafile, and many other providers)

Latest reviews

  • (2025-07-01) Kevin Oh: Awesome front-end for ollama. Would be great if we could pass cli arguments to ollama, since some models (Qwen3) seem to have "thinking" on by default. The slider in the UI for "Thinking" shows "false" but it's on by default and must be explicitly turned off.
  • (2025-06-29) Hristo Kolev: Works awesome, but why no option to summarize YouTube videos?! That's essential!
  • (2025-06-19) Jason: Very impressed. I had been using Open WebUI, which is a lot of trouble to install and set up, only to get super slow LLM web search. Then I found Page Assist, and wow, it's so easy to set up and the web search is just fast. I only hope the "brain" icon can be changed to something better looking and that Page Assist can be saved as a web app.
  • (2025-06-10) Donna Colburn: Please add a login/logout feature. I mean, the ability to automatically back up chat history or prompts via email/Gmail authentication, so it can be accessed universally across all browsers at the same time.
  • (2025-06-03) Naz Home: One of the simpler web UIs for Local AI
  • (2025-06-03) Skyro: One problem: web search is very slow. Please add a DuckDuckGo API option for the AI's search, which is free for unlimited requests and fast too. That's all that's needed; otherwise it's the best lightweight offline AI extension.
  • (2025-06-01) Сергій: Good!
  • (2025-05-26) C M: Couldn't be any easier. Just wish I could specify which drive to store the models on.
  • (2025-05-24) Time for GameDev: Top baby
  • (2025-05-13) Daniel Vagg: I’ve tried out a bunch of extensions for using local LLMs or custom endpoints, and this one, is by far the best I’ve come across so far. It’s super easy to set up, letting me connect my locally hosted models or custom endpoints without any major headaches. The interface is intuitive and there are tons of customisation options that make it really flexible for both personal and professional projects. I haven’t even started diving into all the custom prompts and things yet, but from what I’ve seen so far, this extension feels great! Hats off to the dev or team that built this. Star it on Github if you have an account!
  • (2025-05-03) Bodong Chen: An amazing extension. One small issue -- the sidebar mode cannot be opened in Arc Browser. Apparently it's not the extension's problem but I wonder whether this extension can be updated to work with this situation.
  • (2025-04-23) Lunique: Super nice extension
  • (2025-04-10) Marcos A. Sobrinho: Pretty nice!! Thanks.
  • (2025-04-06) Yiannis Sofologis: this is what I was looking for, perfect for chatting privately with pages using my local qwen2.5:1.5b
  • (2025-04-02) Stu Rodgers: Great extension! Love the simplistic style and the functions it provides.
  • (2025-03-11) Luyao Xu: Very good extension tool for Chrome, especially the web search function!
  • (2025-02-26) M KB: Thank you for this
  • (2025-02-23) Dawid Ryba: Great work!
  • (2025-02-21) Su Mark: The system prompt appears to be malfunctioning.
  • (2025-02-21) HZ97 #凛fam: Nice!
  • (2025-02-19) 于朝阳: The plugin is very easy to use, and it also has a knowledge base feature, so I'm a little curious how the files uploaded to the knowledge base are processed. Are they sent directly to the large model for training? Hoping an expert can answer.
  • (2025-02-18) George Wells (padeoe): Great work! Thanks!
  • (2025-02-15) ma xin: So nice, it really works! Hope the UI becomes more beautiful!
  • (2025-02-15) Qingjun Li: Good to use
  • (2025-02-13) Yuhe Ji: Great extension. However, I noticed that when opening the sidebar with the shortcut key, there is a loading time of a few seconds before the model selection box appears. This issue does not occur when opening a separate page, or when the 'restore last chat' option is checked while opening the sidebar.
  • (2025-02-10) Vic Hav: Good extension. Be aware: if you use a local LLM like Ollama and Ollama happens to not be running, the browser will freeze indefinitely at startup...
  • (2025-02-10) Mojtaba Ranjbar: simple and cool
  • (2025-02-09) 翼影: Will there be an AI tavern feature developed in the future? If this feature is released, I think more players who like to play AI will pay attention to it and use it. If you can put character cards on it, it will save time.
  • (2025-02-08) Lssodi Nowei: I like!!!
  • (2025-02-07) yl test: Access to fetch at 'http://127.0.0.1:11434/api/tags' from origin 'chrome-extension://jfgfiigpkhlkbnfnbobbkinehhfdhndo' has been blocked by CORS policy: The 'Access-Control-Allow-Origin' header has a value 'http://127.0.0.1' that is not equal to the supplied origin. Have the server send the header with a valid value, or, if an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
  • (2025-02-05) 莊承澤: This extension program is very useful. However, I find that the message viewing area is too narrow (my screen has a 4K resolution with a 32” LCD). Could you please add a drop-down menu that allows us to adjust the width of the window? Additionally, could you add a button for clearing the chat conversation history? Thank you for your work.
  • (2025-02-04) M Smart: Tried with ollama Deepseek R1:32b. Works perfectly.
  • (2025-02-03) Gerg: Works very well with DeepSeek R1. I'm surprised my PC even runs it without crashing.
  • (2025-02-03) B: Wow, very easy to use
  • (2025-01-31) Dior Wang (Dyeus): Very nice extension. It saves me the storage and time of installing a local web UI like Open WebUI with Docker.
  • (2025-01-30) Farid Parma: Best extension for Chrome
  • (2025-01-30) Shishira Bhat: Best extension in the world
  • (2025-01-28) KLaus Z: Great work, so useful when deploying a local AI model
  • (2025-01-25) Kevin Lee: Why can't I pull models? I used Cloud Studio to run Ollama with deepseek-r1:8b and llama3:latest. The URL added to Page Assist shows that Ollama is running, but with no models! I do have deepseek-r1:8b and llama3:latest; how is that possible? Could anyone solve this problem? Thx!
  • (2025-01-09) joe pressnell: you are a legend and you will be the next tech on my contact list of heros
  • (2024-12-31) Nikhil Swami: Well polished, feature packed
  • (2024-12-15) Firebeast979: I’ve been using this plugin for a while now, and I’m really impressed with how well it works. It’s fantastic at summarizing and handles a variety of tasks effortlessly. It’s also super fast, and the clean, user-friendly interface makes it a joy to use. That said, I think it could be even better with a couple of additions. For example, it would be awesome if you could attach an image to a system prompt, and have it display alongside it. Another cool feature would be the ability to chat with multiple models at the same time, which would add a lot of flexibility. Other than that, this is an amazing extension, and I highly recommend it to anyone!
  • (2024-12-11) R T: Love this extension. Compared to AnythingLLM and OpenwebUI, both of them frequently time out when using with Ollama, but PageAssist gets it right, doesn't waste valuable time spent composing a well crafted prompt to have it lost because it "times out". You can save and export chats and settings, and manage a full array of LLM settings and models. The only feature I would like to see is having the ability to configure multiple remote servers/LLM's so I can chat with different models at the same time (while waiting for a response).
  • (2024-12-10) David S: Works great, thanks!
  • (2024-11-16) Yiannis Ravanis: Great extension, but the sidebar functionality doesn't seem to work when using the Arc browser on macOS. I left-click somewhere on the page -> Open Copilot To Chat, and nothing happens! :(
  • (2024-10-27) Manuel Herrera Hipnotista y Biomagnetismo: Wow, really nice extension. It feels great to work with a local LLM and Ollama in a useful and practical way. Thanks!
  • (2024-10-23) Ritch Cuvier: I like this application. I wish it could use two different URLs or two different ports in the Ollama URLs. That would allow me to utilize multiple PCs.
  • (2024-10-20) Poyraz Akkaya: Great, but I cannot use it because I use LM Studio. Please add support for LM Studio, because I have an Intel Arc GPU and Ollama doesn't support it 😊 NOTE: Thank you, developer, for adding it 😊
  • (2024-10-14) Leonardo Grando: Congrats, and thanks for bringing us this extension. Fantastic!
  • (2024-10-10) Tristan Nguyen: love it
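The CORS error quoted in one of the reviews above is a known Ollama configuration issue rather than an extension bug: Ollama only accepts browser requests from origins listed in its `OLLAMA_ORIGINS` environment variable. A minimal sketch of the commonly documented workaround (the extension origin is taken from the review; the systemd path assumes a standard Linux install of Ollama):

```shell
# Option 1: one-off, in the shell that launches Ollama.
# Allow Chrome extensions (including Page Assist) to call the local Ollama API.
export OLLAMA_ORIGINS="chrome-extension://*"
ollama serve

# Option 2: persist the setting for a systemd-managed Ollama service.
sudo systemctl edit ollama.service
# In the override file, add under [Service]:
#   Environment="OLLAMA_ORIGINS=chrome-extension://*"
sudo systemctl restart ollama
```

A broader value such as `OLLAMA_ORIGINS="*"` also works but allows any page or extension in the browser to reach the local API, so the narrower `chrome-extension://*` pattern is the safer choice.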

Statistics

Installs: 300,000
Rating: 4.8973 (185 votes)
Last update / version: 2025-07-03 / 1.5.20.1