...The library supports standard chat interactions, text generation, embeddings, and model management, making it useful for both simple chat interfaces and more advanced AI-powered workflows. It runs in Node.js and also supports the browser through a dedicated browser import, which broadens where it can be deployed. Streaming responses are built in: a streamed request returns an async generator, so applications can render output progressively instead of waiting for the full response. Cloud-hosted usage is also supported by pointing the client at Ollama’s cloud endpoint with an API key, preserving a familiar local-first workflow for developers who move between local and remote execution.
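The streaming pattern above can be sketched as follows. This is a minimal, self-contained sketch of consuming chunks through an async iterator; the chunk shape (`{ message: { content } }`) and the `stream: true` option reflect my understanding of the library's chat API, and the mock generator below stands in for the real network call so the pattern runs on its own.

```javascript
// Real usage would look roughly like this (assumed model name and options):
//   import ollama from 'ollama';
//   const stream = await ollama.chat({ model: 'llama3', messages, stream: true });
//   for await (const part of stream) process.stdout.write(part.message.content);

// Mock generator simulating the chunks a streamed chat response yields.
async function* mockChatStream() {
  for (const piece of ['Hello', ', ', 'world', '!']) {
    yield { message: { content: piece } };
  }
}

// Consume the stream progressively; in a UI each chunk could be
// appended to the page as it arrives instead of buffered like this.
async function collectStream(stream) {
  let text = '';
  for await (const part of stream) {
    text += part.message.content;
  }
  return text;
}

collectStream(mockChatStream()).then((reply) => {
  console.log(reply); // prints "Hello, world!"
});
```

The same `for await` loop works unchanged whether the client targets a local server or the cloud endpoint, which is what makes switching between the two low-friction.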