LLM X
LLMX: the easiest third-party local LLM UI for the web
The platform connects to a range of model providers, including local backends such as Ollama, giving users a single unified interface to different AI backends. It supports conversational chat, prompt experimentation, and general-purpose text-generation workflows. As a progressive web app, it can be installed on a device and used offline or with limited connectivity, depending on how the backend is configured. The design prioritizes simplicity and accessibility, making it suitable both for casual users and for developers who want a quick way to interact with models without a complex setup.
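As a sketch of what "connecting to a local provider" involves, the following is a minimal Python client for Ollama's default HTTP endpoint (`POST /api/generate` on port 11434). This is illustrative only, not LLMX's own code; the model name `llama3` is an assumption, and LLMX itself would make an equivalent request from the browser:

```python
import json
from urllib import request

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Minimal non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POST the prompt to the local Ollama server and return the reply text.
    body = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # "llama3" is a placeholder; use any model you have pulled locally.
    print(generate("llama3", "Say hello in one word."))
```

Any provider exposing a similar HTTP API can be swapped in by changing the URL and payload shape, which is what makes a unified UI over multiple backends practical.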