Run LLM locally with Ollama and Open WebUI
Senad Dizdarevic
https://senad-d.github.io//posts/run-llm-locally/