PanGu-Σ

Huawei

About GLM-4.7 FlashX

GLM-4.7 FlashX is a lightweight, high-speed variant of Z.ai's GLM-4.7 large language model that balances efficiency and performance for real-time AI tasks in English and Chinese. Positioned alongside GLM-4.7 and GLM-4.7 Flash, it delivers the family's core strengths, including agentic coding, multi-step reasoning, and robust conversational understanding, with faster response times and lower resource needs. It supports long contexts for complex tasks while remaining light enough to deploy on constrained compute budgets, making it suitable for applications that require rapid inference without heavy infrastructure.

About PanGu-Σ

Significant advances in natural language processing, understanding, and generation have been achieved by scaling up large language models. This work trains a language model with 1.085 trillion parameters, named PanGu-Σ, on Ascend 910 AI processors using the MindSpore framework. Building on the foundation laid by PanGu-α, it transforms the traditionally dense Transformer into a sparse model using Random Routed Experts (RRE). The model was efficiently trained on 329 billion tokens with a technique called Expert Computation and Storage Separation (ECSS), yielding a 6.3-fold increase in training throughput via heterogeneous computing. Experiments indicate that PanGu-Σ sets a new zero-shot standard on a range of downstream Chinese NLP tasks.
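
The routing idea behind Random Routed Experts can be sketched as follows: instead of a learned gating network, each token is assigned to an expert by a fixed random mapping decided up front. The expert count, layer shapes, and hashing scheme below are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 4  # illustrative; the real model uses far more experts
D_MODEL = 8
VOCAB = 100

# Each expert is a small feed-forward weight matrix (toy stand-in
# for a full expert FFN sub-layer).
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]

# RRE-style routing: a fixed random token-id -> expert mapping,
# sampled once before training rather than learned by a gate.
token_to_expert = rng.integers(0, NUM_EXPERTS, size=VOCAB)

def moe_layer(token_ids, hidden):
    """Route each token's hidden state through its assigned expert."""
    out = np.empty_like(hidden)
    for i, tok in enumerate(token_ids):
        e = token_to_expert[tok]
        out[i] = hidden[i] @ experts[e]
    return out

tokens = np.array([3, 17, 42])
h = rng.standard_normal((3, D_MODEL))
y = moe_layer(tokens, h)
print(y.shape)  # (3, 8)
```

Because the mapping is fixed, the same token always reaches the same expert, which removes the load-balancing losses and routing instability that learned gates introduce.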

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience (GLM-4.7 FlashX)

AI engineers and teams building AI applications who need a lightweight, efficient variant of a high-performance large language model for fast, scalable text generation and coding tasks

Audience (PanGu-Σ)

AI developers

Support

Phone Support
24/7 Live Support
Online

API

Offers API

Pricing (GLM-4.7 FlashX)

$0.07 per 1M tokens
Free Version
Free Trial
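
At the listed rate, token cost scales linearly. A minimal cost estimate, using the per-million-token price above (the token counts are examples):

```python
PRICE_PER_M_TOKENS = 0.07  # USD per 1M tokens, as listed above

def cost_usd(tokens: int) -> float:
    """Estimated cost for a given number of tokens at the listed rate."""
    return tokens / 1_000_000 * PRICE_PER_M_TOKENS

print(cost_usd(10_000_000))  # 10M tokens -> 0.7 USD
```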

Pricing (PanGu-Σ)

No information available.
Free Version
Free Trial

Training

Documentation
Webinars
Live Online
In Person

Company Information (GLM-4.7 FlashX)

Z.ai
Founded: 2019
China
docs.z.ai/guides/llm/glm-4.7#glm-4-7-flashx

Company Information (PanGu-Σ)

Huawei
Founded: 1987
China
huawei.com

Alternatives

  • LTM-1 (Magic AI)
  • GLM-4.5V-Flash (Zhipu AI)
  • PanGu-α (Huawei)
  • MiMo-V2-Flash (Xiaomi Technology)
  • Falcon 3 (Technology Innovation Institute, TII)
  • Florence-2 (Microsoft)

Integrations

PanGu Chat
