Showing 28 open source projects for "tiny linux"

  • 1
    HebbBrain

    Feed-forward neural network

    A feed-forward AI neural network. Simple to use, hard to manage. Born to be fast and tiny.
    Downloads: 0 This Week
  • 2
    granite-timeseries-ttm-r2

    Tiny pre-trained IBM model for multivariate time series forecasting

    granite-timeseries-ttm-r2 is part of IBM’s TinyTimeMixers (TTM) series—compact, pre-trained models for multivariate time series forecasting. Unlike massive foundation models, TTM models are designed to be lightweight yet powerful, with only ~805K parameters, enabling high performance even on CPU or single-GPU machines. The r2 version is pre-trained on ~700M samples (r2.1 expands to ~1B), delivering up to 15% better accuracy than the r1 version. TTM supports both zero-shot and fine-tuned...
    Downloads: 0 This Week
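    Below is a minimal, hedged sketch of zero-shot forecasting with this checkpoint. It assumes the weights are published on Hugging Face as ibm-granite/granite-timeseries-ttm-r2 and that IBM's tsfm_public package (from the granite-tsfm repository) exposes a TinyTimeMixerForPrediction class; the 512-step context, 96-step forecast length, and output attribute are assumptions, not details stated in this listing.

        # Hedged sketch: zero-shot multivariate forecasting with a TTM checkpoint.
        # Package, class, repo id, and tensor shapes below are assumptions.
        import torch
        from tsfm_public import TinyTimeMixerForPrediction

        model = TinyTimeMixerForPrediction.from_pretrained(
            "ibm-granite/granite-timeseries-ttm-r2"  # assumed repo id
        )
        model.eval()

        # Dummy history: 1 series, 512 past time steps, 3 channels.
        past_values = torch.randn(1, 512, 3)

        with torch.no_grad():
            out = model(past_values=past_values)

        # Forecast expected at `prediction_outputs`,
        # shape (batch, forecast_length, channels).
        print(out.prediction_outputs.shape)

    With only ~805K parameters, a model of this size should run comfortably on CPU, consistent with the description above.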
  • 3
    Ministral 3 3B Base 2512

    Small 3B-base multimodal model ideal for custom AI on edge hardware

    Ministral 3 3B Base 2512 is the smallest model in the Ministral 3 family, offering a compact yet capable multimodal architecture suited for lightweight AI applications. It combines a 3.4B-parameter language model with a 0.4B vision encoder, enabling both text and image understanding in a tiny footprint. As the base pretrained model, it is not fine-tuned for instructions or reasoning, making it the ideal foundation for custom post-training, domain adaptation, or specialized downstream tasks....
    Downloads: 0 This Week
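    Because this is a base checkpoint rather than an instruction-tuned one, a plausible first step is plain text continuation before any post-training. The sketch below is assumption-heavy: the Hugging Face repo id mistralai/Ministral-3-3B-Base-2512 is hypothetical, loading through the standard transformers Auto classes is assumed to work for the text-only path, and the image/vision path is omitted entirely.

        # Hedged sketch: raw text continuation with an assumed base checkpoint.
        # Repo id and Auto-class compatibility are assumptions, not confirmed here.
        import torch
        from transformers import AutoModelForCausalLM, AutoTokenizer

        repo_id = "mistralai/Ministral-3-3B-Base-2512"  # hypothetical repo id

        tokenizer = AutoTokenizer.from_pretrained(repo_id)
        model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

        # Base models have no chat template; they simply continue the prompt.
        inputs = tokenizer("Edge deployment of small multimodal models requires", return_tensors="pt")
        output_ids = model.generate(**inputs, max_new_tokens=32, do_sample=False)
        print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

    As the entry notes, instruction following and reasoning would come from custom post-training on top of this base rather than from the pretrained weights themselves.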