With Ollama Web UI you not only get the easiest way to run your own local AI on your computer (thanks to Ollama), it also comes with OllamaHub support, where you can find prompts, Modelfiles (to give your AI a personality) and more, all of it powered by the community.
FixtSE Web: [ Link ]
Main Project Page: [ Link ]
00:00 Prerequisites
00:47 Install it on WSL
02:06 Docker Installation (Linux/WSL)
03:21 Activate GPU Compatibility
04:11 Installation
05:00 How to update it
05:19 Ollama WebUI
05:42 Install a New Model
06:36 Use your new model
07:17 OllamaHub
09:00 Windows Limitations
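The WSL/Docker steps in the chapters above can be sketched roughly as follows. This is a minimal outline, not the exact commands from the video: the image name `ghcr.io/ollama-webui/ollama-webui:main` was the project's image at the time (it has since been renamed to Open WebUI), and the port and volume names are assumptions.

```shell
# Install Ollama on Linux/WSL (official install script)
curl -fsSL https://ollama.com/install.sh | sh

# Run the web UI in Docker, pointing it at the host's Ollama instance
# (image/volume names assumed; adjust to what the video uses)
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v ollama-webui:/app/backend/data \
  --name ollama-webui --restart always \
  ghcr.io/ollama-webui/ollama-webui:main

# Update: pull the newer image, then remove and recreate the container
docker pull ghcr.io/ollama-webui/ollama-webui:main
docker stop ollama-webui && docker rm ollama-webui
# ...then repeat the `docker run` command above

# Install a new model and try it from the CLI
ollama pull mistral
ollama run mistral "Hello!"
```

Once the container is up, the web UI should be reachable at http://localhost:3000 in your browser.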
If you like my work, please consider supporting me on Ko-fi! ☕🎉: [ Link ]
Patreon: [ Link ]
or Join this channel to get access to perks:
[ Link ]
You can find me on:
Web: [ Link ]
Instagram: [ Link ]
I hope this was useful. If you have any questions, leave a comment below.
Thank you for watching (~ ̄▽ ̄)~