:sparkles: SuperAGI v0.0.14 :sparkles:

:rocket: Enhanced Local LLM Support with Multi-GPU :tada:

New Feature Highlights :star2:

⚙️ Local Large Language Model (LLM) Integration:

  - SuperAGI now supports local large language models, allowing users to leverage their own models seamlessly within the SuperAGI framework.
  - Easily configure and integrate your preferred LLMs for enhanced customization and control over your AI agents.

⚡️ Multi-GPU Support:

  - SuperAGI now provides multi-GPU support for improved performance and scalability.
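
For context, Compose-based GPU setups usually hand a container access to the host's GPUs through a device reservation. The snippet below is a minimal sketch of that pattern under the Compose specification, not the actual contents of SuperAGI's docker-compose-gpu.yml; the service name is illustrative.

```yaml
# Minimal sketch (not SuperAGI's actual file): granting a Compose service
# access to all host GPUs via a device reservation.
services:
  backend:                  # illustrative service name
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all    # reserve every GPU, enabling multi-GPU inference
              capabilities: [gpu]
```

With `count: all` the container sees every GPU on the host; a fixed integer reserves only a subset instead.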

How to Use

To enable local Large Language Model (LLM) support with multi-GPU, follow these steps:

  1. Add your model path to the celery and backend volumes in the docker-compose-gpu.yml file (a sketch follows this list).
  2. Run: `docker compose -f docker-compose-gpu.yml up --build`
  3. Open localhost:3000 in your browser.
  4. Add a local LLM model from the model section.
  5. Use the added model to run your agents.
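
The volume edit in step 1 might look like the sketch below. Both the host path and the in-container mount point are placeholders invented for illustration; the real mount point is whatever SuperAGI's shipped docker-compose-gpu.yml documents.

```yaml
# Hypothetical sketch of step 1. /host/models and /app/local_model_path are
# placeholder paths; use the mount point expected by docker-compose-gpu.yml.
services:
  backend:
    volumes:
      - /host/models:/app/local_model_path
  celery:
    volumes:
      - /host/models:/app/local_model_path   # keep identical to backend
```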

What’s Changed

  • Local LLM Integration with Multi-GPU Support by @rounak610 in #1391, #1351, #1306