mac code is a local AI coding agent that runs large language models directly on Apple Silicon machines with no cloud dependency, turning a Mac into a self-contained AI development environment. Its core technique is streaming model weights from SSD storage, which lets models that would otherwise exceed available RAM run efficiently despite the hardware limit. The tool operates as a CLI assistant that uses an LLM as a router, dispatching each user prompt to a chat, shell-command, or web-search execution path. It integrates with inference engines such as llama.cpp and Apple's MLX framework, allowing users to run models of up to 35B parameters locally with varying performance trade-offs.
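
The LLM-as-router idea described above can be sketched as a small dispatch loop. This is an illustrative sketch only: the `classify` and `dispatch` names and the keyword heuristics are assumptions, not the project's actual API (in practice a small model call would label the prompt's intent).

```python
# Hypothetical sketch of an LLM-as-router dispatch loop.
# classify() stands in for a small LLM call that labels the
# prompt's intent; real routing would use the model itself.

def classify(prompt: str) -> str:
    """Label a prompt as 'shell', 'search', or 'chat' (toy heuristic)."""
    lowered = prompt.lower()
    if lowered.startswith(("run ", "ls", "git ")):
        return "shell"
    if "search" in lowered or lowered.endswith("?"):
        return "search"
    return "chat"

def dispatch(prompt: str) -> str:
    """Route a prompt to one of the execution paths."""
    route = classify(prompt)
    if route == "shell":
        return f"[shell] would execute: {prompt}"
    if route == "search":
        return f"[search] would query the web for: {prompt}"
    return f"[chat] would send to the local model: {prompt}"
```

Keeping the router as a separate step means each execution path (chat, shell, search) can stay a simple, independently testable function.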

Features

  • Local execution of large language models without cloud dependency
  • SSD-based weight streaming to run models beyond RAM limits
  • LLM-as-router system for chat, shell, and search tasks
  • Integration with llama.cpp and MLX backends
  • Persistent KV cache for long-context and session continuity
  • Support for large MoE models with optimized performance techniques

Categories

AI Coding


Additional Project Details

Operating Systems: Mac

Programming Language: Python

Related Categories: Python AI Coding Tool

Registered: 5 hours ago