Reins
Ollama client that simplifies experimenting with LLMs
It provides a highly customizable chat interface where users can configure system prompts, switch models mid-conversation, and adjust inference parameters such as temperature, token limit, and context size on a per-conversation basis. The app runs on mobile and desktop platforms, giving a wide range of users consistent control over their AI workflows. It also supports editing and regenerating messages, so outputs can be refined iteratively without restarting a conversation.
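The per-conversation settings described above correspond to fields in Ollama's `/api/chat` endpoint: the system prompt becomes a `system`-role message, and temperature, token limit, and context size map onto the `temperature`, `num_predict`, and `num_ctx` entries of the `options` object. A minimal sketch in Python (the function name `build_chat_request` and the default values are illustrative, not part of Reins):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local port

def build_chat_request(model, system_prompt, user_message,
                       temperature=0.8, max_tokens=256, context_size=4096):
    """Assemble the JSON body for a single-turn chat request.

    The keys under "options" use Ollama's names: temperature,
    num_predict (max tokens to generate), num_ctx (context window).
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "options": {
            "temperature": temperature,
            "num_predict": max_tokens,
            "num_ctx": context_size,
        },
        "stream": False,
    }

payload = build_chat_request(
    model="llama3.2",
    system_prompt="You are a concise assistant.",
    user_message="Explain context size in one sentence.",
    temperature=0.3,
)
body = json.dumps(payload).encode()
# Uncomment to send the request to a running Ollama server:
# req = urllib.request.Request(OLLAMA_URL, data=body,
#                              headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["message"]["content"])
```

Because the options travel with each request rather than being baked into the model, a client can keep a separate settings object per conversation and apply it on every turn, which is what makes per-conversation configuration possible.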