ChatGLM.cpp is a C++ implementation of the ChatGLM-6B model, enabling efficient local inference without requiring a Python environment. It is optimized for running on consumer hardware.
Features
- Provides a C++ implementation of ChatGLM-6B
- Supports running models on CPU and GPU
- Optimized for low-memory hardware and edge devices
- Allows quantization for reduced resource consumption
- Works as a lightweight alternative to Python-based inference
- Offers real-time chatbot capabilities
License
MIT License