Ollama Server
There is no need for Termux: you can start the Ollama service directly from the app.
The application includes basic model lifecycle management: downloading official models, uploading custom GGUF files, and controlling running instances. It also supports fully offline inference, so users can run language models entirely on-device without a network connection or external infrastructure.
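Once the service is running, other apps or scripts can talk to it over Ollama's standard HTTP API. The sketch below is a minimal illustration, assuming the service listens on Ollama's default port 11434 on localhost; the model name and prompt are placeholders, not defaults shipped with the app.

```python
import json
import urllib.request

# Default Ollama endpoint; adjust if the app binds a different host/port.
OLLAMA_URL = "http://localhost:11434"

def api_request(path, payload=None):
    """Build a JSON request against the local Ollama HTTP API."""
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        OLLAMA_URL + path,
        data=data,
        headers={"Content-Type": "application/json"},
    )

# Download ("pull") an official model by name (illustrative model tag):
pull_req = api_request("/api/pull", {"model": "llama3.2"})

# Run inference against a locally available model:
gen_req = api_request("/api/generate", {
    "model": "llama3.2",
    "prompt": "Why is the sky blue?",
    "stream": False,
})

# Sending a request requires the service to be running, e.g.:
# with urllib.request.urlopen(gen_req) as resp:
#     print(json.loads(resp.read())["response"])
```

Because everything is served from localhost, the same calls work with no internet access, which is what makes fully offline inference possible.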