GetProfile is a drop-in proxy layer that sits in front of your LLM provider and turns otherwise stateless chat requests into a system with persistent user profiles and long-term memory. Rather than requiring you to redesign your application, you simply route your model calls through GetProfile; it captures conversation context automatically as traffic flows, extracts structured traits and “memories” from those conversations, stores them, and injects the most relevant profile context back into future prompts so responses stay consistent and personalised over time.

The goal is to make memory and user understanding an infrastructure concern rather than an app-by-app feature, so teams can add continuity with minimal code changes. Because GetProfile behaves like an OpenAI-compatible gateway, it works with multiple providers and with tools that already speak that API shape.
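To make the "drop-in" routing concrete, here is a minimal sketch of what a proxied request could look like. The base URL (`https://getprofile.example/v1`) and the `X-GetProfile-User` header are illustrative assumptions, not documented values; the only real point is that the request body stays in the standard OpenAI chat-completions shape, so existing clients need only a base-URL change.

```python
import json

# Assumed deployment URL for your GetProfile instance (hypothetical).
GETPROFILE_BASE_URL = "https://getprofile.example/v1"


def build_chat_request(user_id, messages):
    """Build an OpenAI-compatible chat request routed through the proxy.

    Versus calling the provider directly, only two things change:
    the base URL, and a header telling the proxy which user's
    profile this conversation belongs to (header name is assumed).
    """
    return {
        "url": f"{GETPROFILE_BASE_URL}/chat/completions",
        "headers": {
            "Authorization": "Bearer <provider-or-proxy-key>",
            "X-GetProfile-User": user_id,  # hypothetical header name
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "gpt-4o-mini",
            "messages": messages,
        }),
    }


request = build_chat_request(
    "user-123",
    [{"role": "user", "content": "Plan my week around my usual gym schedule."}],
)
print(request["url"])
```

Because the payload is unchanged, any SDK that lets you override the base URL (and add default headers) can talk to the proxy without further code changes.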
## Features
- Drop-in proxy that captures user–AI conversations automatically
- Structured trait and memory extraction from chat history
- Automatic context injection into downstream prompts
- Continuous profile updates as new messages arrive
- OpenAI-compatible routing for multi-provider support
- Optional access control for protecting the proxy endpoint
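The context-injection feature above can be pictured as a small pre-forwarding step: before the proxy relays a request, it prepends the most relevant stored profile facts as a system message. The sketch below is an assumption about the mechanism, not GetProfile's actual implementation; the profile shape and relevance ranking are illustrative.

```python
def inject_profile_context(messages, profile_facts, max_facts=3):
    """Prepend top-ranked profile facts to an OpenAI-style message list.

    Assumes `profile_facts` is already ordered by relevance to the
    current conversation (how GetProfile ranks facts is not shown here).
    """
    top = profile_facts[:max_facts]
    context = "Known about this user:\n" + "\n".join(f"- {f}" for f in top)
    # The injected system message travels with the request downstream,
    # so the provider sees the profile context without any app changes.
    return [{"role": "system", "content": context}] + list(messages)


facts = ["Prefers concise answers", "Works mainly in Go", "Based in Berlin"]
msgs = inject_profile_context(
    [{"role": "user", "content": "How should I structure a CLI tool?"}],
    facts,
)
print(msgs[0]["content"])
```

The upstream application still sends a single user message; the enriched message list exists only on the proxy-to-provider leg, which is what keeps memory an infrastructure concern rather than an app feature.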