ChatGLM.cpp is a C++ implementation of the ChatGLM-6B model, enabling efficient local inference without requiring a Python environment. It is optimized for running on consumer hardware.

Features

  • Provides a C++ implementation of ChatGLM-6B
  • Supports running models on CPU and GPU
  • Optimized for low-memory hardware and edge devices
  • Allows quantization for reduced resource consumption (see the sketch after this list)
  • Works as a lightweight alternative to Python-based inference
  • Offers real-time chatbot capabilities
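
As a rough illustration of the CPU-oriented, quantized use case above, the sketch below shows how an application might embed the library. It assumes a Pipeline-style interface along the lines of the project's chatglm.h header (model path passed to the constructor, a generate() call that returns text) and a model file that has already been converted and quantized; exact class names, method signatures, and configuration fields may differ between releases, so treat this as a hedged sketch rather than the definitive API.

    // Hedged sketch: assumes a Pipeline-style interface roughly matching
    // chatglm.h (constructor takes the path to a converted GGML model file,
    // generate() produces text). Exact names and signatures may vary.
    #include <iostream>
    #include <string>

    #include "chatglm.h"  // header shipped with ChatGLM.cpp

    int main() {
        // Placeholder path to a model converted and quantized offline
        // (e.g. to q4_0) to cut memory use on consumer hardware.
        const std::string model_path = "chatglm-ggml.bin";

        // Load the quantized model once; inference then runs entirely in
        // C++, on the CPU by default, with no Python runtime involved.
        chatglm::Pipeline pipeline(model_path);

        chatglm::GenerationConfig gen_config;
        gen_config.max_length = 256;  // keep the demo generation short

        // Single-turn prompt; an interactive chatbot would loop over user
        // input and carry the conversation history forward.
        std::string reply = pipeline.generate("Hello, who are you?", gen_config);
        std::cout << reply << std::endl;
        return 0;
    }

Compiling this means building the library with the repository's CMake setup and linking against it; the exact build targets depend on the release in use.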

License

MIT License

Additional Project Details

Operating Systems: Linux, Mac, Windows
Programming Language: C++
Related Categories: C++ Large Language Models (LLM), C++ Natural Language Processing (NLP) Tool, C++ AI Models, C++ LLM Inference Tool
Registered: 2025-01-21