Operating LLMs in production
The Triton Inference Server provides an optimized cloud and edge inferencing solution
Replace OpenAI GPT with another LLM in your app
Serving LangChain LLM apps automagically with FastAPI (see the sketch after this list)
Build AI-powered semantic search applications
Open source annotation tool for machine learning practitioners
Open-source framework that gives you AI Agents
State-of-the-art Multilingual Question Answering research
A Deep-Learning-Based Chinese Speech Recognition System
Aseryla code repositories
A multi-modeling and simulation environment to study complex systems
Aims to enable researchers to tap into mobile computing capability
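
The FastAPI-serving entry above is the most how-to-like item in this list. As a rough illustration only, here is a minimal sketch of exposing an LLM call behind a FastAPI endpoint; the `run_chain` helper is a hypothetical placeholder for whatever chain or model client an app actually uses, not part of any project listed here.

```python
# Minimal sketch: serve an LLM-backed endpoint with FastAPI.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Query(BaseModel):
    prompt: str

def run_chain(prompt: str) -> str:
    # Placeholder: swap in the real LangChain chain / LLM client here.
    return f"echo: {prompt}"

@app.post("/generate")
def generate(query: Query) -> dict:
    # Run the (placeholder) chain and return its output as JSON.
    return {"output": run_chain(query.prompt)}

# Run locally with: uvicorn app:app --reload
```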