8 projects for "mac software" with 2 filters applied:

  • 1
    FlashMLA

    FlashMLA: Efficient Multi-head Latent Attention Kernels

    FlashMLA is a high-performance decoding kernel library designed especially for Multi-Head Latent Attention (MLA) workloads, targeting NVIDIA Hopper GPU architectures. It provides optimized kernels for MLA decoding, including support for variable-length sequences, helping reduce latency and increase throughput in model inference systems using that attention style. The library supports both BF16 and FP16 data types, and includes a paged KV cache implementation with a block size of 64 to...
    Downloads: 1 This Week
    Last Update:
    See Project
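    The FlashMLA entry above mentions a paged KV cache with a block size of 64 and BF16 storage. The sketch below illustrates that layout in plain PyTorch: a shared pool of fixed-size KV blocks plus a per-sequence block table. It is a conceptual illustration only, not FlashMLA's actual API; every name in it is hypothetical.

    ```python
    # Conceptual sketch of a paged KV cache with 64-token blocks, stored in BF16.
    # Plain PyTorch for illustration only -- this is NOT FlashMLA's API, and every
    # name below (kv_pool, block_table, append_kv, ...) is hypothetical.
    import torch

    BLOCK_SIZE = 64                     # tokens per KV block, as cited above
    NUM_BLOCKS = 1024                   # size of the shared block pool
    NUM_KV_HEADS, HEAD_DIM = 1, 576     # MLA keeps one latent KV head; dims illustrative

    # One shared pool of physical KV blocks.
    kv_pool = torch.zeros(NUM_BLOCKS, BLOCK_SIZE, NUM_KV_HEADS, HEAD_DIM, dtype=torch.bfloat16)

    # Per-sequence block table: logical block index -> physical block in the pool.
    block_table = {0: [3, 17], 1: [5]}  # e.g. sequence 0 owns physical blocks 3 and 17
    seq_lens = {0: 100, 1: 42}          # current token count per sequence
    free_blocks = [b for b in range(NUM_BLOCKS) if b not in (3, 17, 5)]

    def append_kv(seq_id: int, kv: torch.Tensor) -> None:
        """Append one token's KV vector, allocating a fresh 64-token block when needed."""
        pos = seq_lens[seq_id]
        if pos // BLOCK_SIZE >= len(block_table[seq_id]):
            block_table[seq_id].append(free_blocks.pop())   # sequence outgrew its blocks
        block = block_table[seq_id][pos // BLOCK_SIZE]
        kv_pool[block, pos % BLOCK_SIZE] = kv
        seq_lens[seq_id] += 1

    append_kv(0, torch.randn(NUM_KV_HEADS, HEAD_DIM))        # token 101 of sequence 0
    ```

    A real decoding kernel would read this block table directly on the GPU; the point here is only the indirection that lets variable-length sequences share one pre-allocated pool.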
  • 2
    Granite Code Models

    A Family of Open Foundation Models for Code Intelligence

    Granite Code Models are IBM’s open-source, decoder-only models tailored for code tasks such as fixing bugs, explaining and documenting code, and modernizing codebases. Trained on code from 116 programming languages, the family targets strong performance across diverse benchmarks while remaining accessible to the community. The repository introduces the model lineup, intended uses, and evaluation highlights, and it complements IBM’s broader Granite initiative spanning multiple modalities....
    Downloads: 1 This Week
    Last Update:
    See Project
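    Because the Granite Code checkpoints are published as standard decoder-only models, the quickest way to try one is through Hugging Face Transformers. A minimal sketch follows, assuming the 3B instruct variant under the ibm-granite organization; the repository id, dtype, and prompt are illustrative assumptions, not details taken from this listing.

    ```python
    # Minimal sketch: loading a Granite Code instruct model with Hugging Face
    # Transformers. The repo id "ibm-granite/granite-3b-code-instruct" and the
    # prompt are assumptions for illustration; pick the size that fits your GPU.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ibm-granite/granite-3b-code-instruct"   # assumed repository id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    messages = [{
        "role": "user",
        "content": "Explain what this does:\n\ndef f(xs):\n    return sorted(set(xs))",
    }]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    out = model.generate(inputs, max_new_tokens=200)
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
    ```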
  • 3
    MiniMax-M2.1

    MiniMax M2.1, a SOTA model for real-world dev & agents.

    MiniMax-M2.1 is an open-source, state-of-the-art agentic language model released to democratize high-performance AI capabilities. It goes beyond a simple parameter upgrade, delivering major gains in coding, tool use, instruction following, and long-horizon planning. The model is designed to be transparent, controllable, and accessible, enabling developers to build autonomous systems without relying on closed platforms. MiniMax-M2.1 excels in real-world software engineering tasks, including...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 4
    MiniMax-M1

    Open-weight, large-scale hybrid-attention reasoning model

    MiniMax-M1 is presented as the world’s first open-weight, large-scale hybrid-attention reasoning model, designed to push the frontier of long-context, tool-using, and deeply “thinking” language models. It is built on the MiniMax-Text-01 foundation and keeps the same massive parameter budget, but reworks the attention and training setup for better reasoning and test-time compute scaling. Architecturally, it combines Mixture-of-Experts layers with lightning attention, enabling the model to...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 5
    Qwen2.5-Coder

    Qwen2.5-Coder is the code-specialized variant of the Qwen2.5 large language model family

    Qwen2.5-Coder, developed by QwenLM, is an advanced open-source code generation model designed for developers seeking powerful and diverse coding capabilities. It includes multiple model sizes—ranging from 0.5B to 32B parameters—providing solutions for a wide array of coding needs. The model supports over 92 programming languages and offers exceptional performance in generating code, debugging, and mathematical problem-solving. Qwen2.5-Coder, with its long context length of 128K tokens, is...
    Downloads: 11 This Week
    Last Update:
    See Project
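    A minimal way to exercise plain code completion with one of the Qwen2.5-Coder base checkpoints is sketched below, assuming the 7B size from the 0.5B-32B range mentioned above; the repository id and generation settings are illustrative, and the instruct variants would be driven through a chat template instead.

    ```python
    # Minimal sketch: plain code completion with a Qwen2.5-Coder base checkpoint.
    # "Qwen/Qwen2.5-Coder-7B" is one of the published sizes (0.5B-32B); greedy
    # decoding and the prompt are illustrative choices, not recommendations.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen2.5-Coder-7B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    prompt = "# Python 3\ndef quicksort(arr):\n"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
    ```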
  • 6
    Devstral Small 2

    Lightweight 24B agentic coding model with vision and long context

    Devstral Small 2 is a compact agentic language model designed for software engineering workflows, excelling at tool usage, codebase exploration, and multi-file editing. With 24B parameters and FP8 instruct tuning, it delivers strong instruction following while remaining lightweight enough for local and on-device deployment. The model achieves competitive performance on SWE-bench, validating its effectiveness for real-world coding and automation tasks. It introduces vision capabilities,...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 7
    Devstral 2

    Agentic 123B coding model optimized for large-scale engineering

    Devstral 2 is a large-scale agentic language model purpose-built for software engineering tasks, excelling at codebase exploration, multi-file editing, and tool-driven automation. With 123B parameters and FP8 instruct tuning, it delivers strong instruction following for chat-based workflows, coding assistants, and autonomous developer agents. The model demonstrates outstanding performance on SWE-bench, validating its effectiveness in real-world engineering scenarios. It generalizes well...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 8
    Kimi K2

    Kimi K2: 1T-param MoE model for advanced coding and agentic reasoning

    Kimi K2 (K2-Instruct-0905) is a state-of-the-art Mixture-of-Experts (MoE) language model developed by Moonshot AI, designed for high-performance reasoning, coding assistance, and agentic task orchestration. It features 1 trillion total parameters with 32 billion activated per token, enabling strong efficiency while maintaining very high capability. Kimi K2 demonstrates major gains in real-world coding and tool-use benchmarks, especially in SWE-Bench, Terminal-Bench, and multilingual...
    Downloads: 0 This Week
    Last Update:
    See Project
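    At 1 trillion total parameters, Kimi K2 is normally reached through a served endpoint rather than loaded locally. The sketch below exercises its tool-calling through an OpenAI-compatible API (for example a vLLM or SGLang deployment); the base URL, API key, model name, and tool schema are placeholders, not values from this listing.

    ```python
    # Minimal sketch: tool calling against an OpenAI-compatible endpoint serving
    # Kimi K2 (e.g. a vLLM or SGLang deployment). base_url, api_key, model name,
    # and the "run_tests" tool are placeholders for illustration.
    import json
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

    tools = [{
        "type": "function",
        "function": {
            "name": "run_tests",   # hypothetical tool exposed to the model
            "description": "Run the project's test suite and return the summary.",
            "parameters": {"type": "object", "properties": {}, "required": []},
        },
    }]

    resp = client.chat.completions.create(
        model="Kimi-K2-Instruct-0905",   # placeholder; match your deployment's model name
        messages=[{"role": "user", "content": "The build is red. Find out why."}],
        tools=tools,
    )

    msg = resp.choices[0].message
    if msg.tool_calls:                   # the model decided to call a tool
        for call in msg.tool_calls:
            print(call.function.name, json.loads(call.function.arguments or "{}"))
    else:
        print(msg.content)
    ```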