Agentic, Reasoning, and Coding (ARC) foundation models
Powerful Mixture-of-Experts (MoE) language model optimized for efficiency and performance
⚡ Building applications with LLMs through composability ⚡
SimpleMem: Efficient Lifelong Memory for LLM Agents
A New Axis of Sparsity for Large Language Models
Ongoing research on training transformer models at scale
This repository provides an advanced retrieval-augmented generation (RAG) implementation
Open-weight, large-scale hybrid-attention reasoning model
Open-source large language model family from Tencent Hunyuan
Phi-3.5 for Mac: Locally-run Vision and Language Models
Large-language-model & vision-language-model based on Linear Attention
Code for the paper "Language models can explain neurons in language models"
Learn AI and LLMs from scratch using free resources
Code for the paper "Fine-Tuning Language Models from Human Preferences"