Description from extension meta
This extension hosts an ollama-ui web server on localhost
Description from store
Just a simple HTML UI for Ollama
Source: https://github.com/ollama-ui/ollama-ui
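The upstream project is a plain static HTML/JS app, so it can also be tried outside the extension. A minimal sketch of running it locally, assuming any static file server will do (python3's built-in http.server is used here as an illustration, not necessarily the repository's documented workflow):

```sh
# Fetch the UI and serve it as static files; Ollama itself must already
# be running on its default address, localhost:11434.
git clone https://github.com/ollama-ui/ollama-ui
cd ollama-ui
python3 -m http.server 8000
# Then open http://localhost:8000 in a browser.
```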
Latest reviews
- (2025-05-12) Kertijayan Labs: I would say it is very lightweight. Easy to use; thank you for developing tools like this.
- (2025-02-13) Mohammad Kanawati: Easy, light, and it just lets you chat. However, I wish it had more customization, a system prompt, or a memory section. As for multiline input, I wish the bindings were the opposite (Enter to send, and Ctrl+Enter for a new line).
- (2024-10-31) Sultan Papağanı: Please update it and add more features. It's awesome. Suggestions: a send-on-Enter setting; image upload/download if it doesn't already exist; exporting a chat to .txt; and renaming the saved chat title, which currently asks for our name when it should say "Chat name" or something.
- (2024-10-27) Manuel Herrera Hipnotista y Biomagnetismo: Simple solutions, as all effective things are. Thanks!
- (2024-08-08) Bill Gates Lin: How do I set the prompt?
- (2024-08-08) Damien PEREZ (Dadamtp): Yep, it's true, it only works with Ollama on localhost. But my Ollama runs on another server, exposed through open-webui. So I set up a reverse proxy, http://api.ai.lan -> 10.XX.XX.XX:11435, but the extension can't access it. I also tested the direct IP, http://10.1.33.231:11435, but you force the default port, so it fails to fetch: http://10.1.33.231:11435:11434/api/tags. Finally, I made an SSH tunnel: ssh -L 11434:localhost:11435 [email protected]. It works, but it's not pretty. [See the tunnel sketch after the reviews.]
- (2024-08-06) Fabricio cincunegui: I wished for a low-end-friendly GUI for Ollama. You made it; thanks!
- (2024-05-19) Frédéric Demers: Wonderful extension, easy to get started with local large language models without needing a web server, etc. Would you consider including a MathJax library in the extension so that equations are rendered correctly? Something like <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/3.2.2/es5/latest.min.js?config=TeX-AMS-MML_HTMLorMML"></script>, or perhaps package MathJax as a local resource.
- (2024-05-09) Hadi Shakiba: It's great to have access to such a useful tool. Having 'copy' and 'save to txt' buttons would be a fantastic addition!
- (2024-05-01) Daniel Pfeiffer: I like it! So far the best way to easily chat with a local model in an uninterrupted way.
- (2024-04-25) Luis Hernández: Nice work. Just curious, how did you manage to get around Ollama’s limitation of only accepting POSTs from localhost, since the extension originates from chrome-extension://? Regards,
- (2024-04-25) Frédéric HAUGUEL: Can't use the "Enter" key to submit a message.
- (2024-04-15) Angel Cinelli: Great work, congratulations! On a local computer running the Ollama server it runs beautifully. However, running this ollama-ui Chrome extension from a client PC does not work. From the client I can list the LLM models present on the server PC hosting Ollama, and I can send a query that reaches the Ollama server, but I never get an answer on the client; the message "Failed to fetch" appears. I exposed Ollama via 0.0.0.0 on the local network by setting an environment variable on the host PC, and I also disabled the firewall on both the client and the host, but the problem persists. Do you have any suggestions? [See the note after the reviews.]
- (2024-04-01) Giovanni Garcia: This is a fantastic jumping-off point for Ollama users. Some things I would consider adding: a resource monitor (RAM, GPU, etc.); the size of the selected model in the drop-down menu; accessibility from any device on the same network; the ability to pause a conversation with one model and switch to another; and chat-with-docs capability. Overall, this really is a great product already.
- (2024-03-29) 1981 jans: Ollama is not compatible with the Quadro K6000 card or the NVIDIA GTX 650 card. Is it only for rich people? If you want your work to be recognized, make it work for poor people too, not only the rich!
- (2024-02-16) Razvan Bolohan (razvanab): Not bad, I just wish I could use it with Vision models too.
- (2024-02-03) Justin Doss: Thank you for developing this. Well done.
- (2023-12-29) SuperUUID: Really cool extension.
- (2023-12-18) fatty mcfat: Awesome extension, easiest setup.
- (2023-12-05) Mo Kahlain: Well done, and very useful!
- (2023-11-21) MSTF: Well done! Thanks for the useful extension.
- (2023-11-21) Victor Tsaran: Really awesome extension. I wish I could upload documents / files to the LLM, but that's more of a feature request!
- (2023-11-16) Horia Cristescu: Works great! I only wish I could send the current page as context.
- (2023-11-13) Victor Jung: Made life easier for me, as I could not restart Ollama from the Windows CLI. Kudos to the team who developed this extension.
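Regarding the reverse-proxy review above (2024-08-08, Damien PEREZ): a sketch of the SSH-tunnel workaround, assuming the extension is hard-wired to http://localhost:11434 and the remote Ollama listens on port 11435; user@remote-host is a placeholder for the redacted login:

```sh
# Forward the extension's expected local port to the remote Ollama.
# -N: open the tunnel without running a remote command.
ssh -N -L 11434:localhost:11435 user@remote-host
# The extension's requests to http://localhost:11434/api/... now reach
# the remote instance on port 11435.
```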
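Regarding the "Failed to fetch" review above (2024-04-15, Angel Cinelli): one plausible cause, offered as an assumption rather than a confirmed diagnosis, is that Ollama accepts the connection but rejects the extension's chrome-extension:// Origin header, which the browser surfaces as a generic fetch failure. Ollama's bind address and allowed origins are both controlled by environment variables, so a sketch of exposing it on a LAN would be:

```sh
# Bind to all interfaces and allow the extension's origin.
OLLAMA_HOST=0.0.0.0:11434 OLLAMA_ORIGINS="chrome-extension://*" ollama serve
# For a quick test only, allow any origin:
# OLLAMA_ORIGINS="*" ollama serve
```

The same Origin check is presumably what the earlier question about localhost-only POSTs (2024-04-25, Luis Hernández) is about.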