6 projects for "framework-arduinoststm32" with 2 filters applied:

  • 1
    Exposure Correction

    Learning a multi-scale deep model to correct over- and under-exposed photographs

    The repository focuses on correcting poorly exposed photographs, handling both underexposure and overexposure with a deep learning approach. The method employs a multi-scale framework that learns to enhance images by adjusting exposure levels across different spatial resolutions, which lets the model preserve fine details while correcting global lighting inconsistencies. The repository includes pre-trained models, datasets, and training/testing code to enable reproducibility and experimentation. Researchers and developers can use it to apply exposure correction to a wide range of natural images, improving visual quality without manual editing. A hedged sketch of the coarse-to-fine idea appears below.
    Downloads: 2 This Week
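    A toy PyTorch sketch of that coarse-to-fine structure. The names (ScaleBranch, MultiScaleCorrector) and layer sizes are invented for illustration, not the repository's actual architecture:

    ```python
    # Hypothetical multi-scale exposure correction: fix the coarsest level
    # first (global exposure), then upsample and add finer detail residuals.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ScaleBranch(nn.Module):
        """Small CNN predicting a per-pixel correction at one scale."""
        def __init__(self, ch=16):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(ch, 3, 3, padding=1),
            )
        def forward(self, x):
            return self.net(x)

    class MultiScaleCorrector(nn.Module):
        def __init__(self, n_scales=3):
            super().__init__()
            self.branches = nn.ModuleList([ScaleBranch() for _ in range(n_scales)])
            self.n_scales = n_scales

        def forward(self, x):
            # Build an image pyramid, finest resolution first.
            pyr = [x]
            for _ in range(self.n_scales - 1):
                pyr.append(F.avg_pool2d(pyr[-1], 2))
            # Correct coarse-to-fine so global lighting is fixed before details.
            out = pyr[-1] + self.branches[-1](pyr[-1])
            for i in range(self.n_scales - 2, -1, -1):
                out = F.interpolate(out, scale_factor=2, mode="bilinear",
                                    align_corners=False)
                out = out + self.branches[i](pyr[i])
            return out.clamp(0, 1)

    img = torch.rand(1, 3, 64, 64)           # a poorly exposed image in [0, 1]
    print(MultiScaleCorrector()(img).shape)  # torch.Size([1, 3, 64, 64])
    ```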
  • 2
    Megatron

    Ongoing research training transformer models at scale

    We developed efficient, model-parallel (tensor, sequence, and pipeline) and multi-node pre-training of transformer-based models such as GPT, BERT, and T5 using mixed precision. Megatron is also used in NeMo Megatron, a framework that helps enterprises overcome the challenges of building and training sophisticated natural language processing models with billions to trillions of parameters. A single-process sketch of the tensor-parallel idea follows below.
    Downloads: 2 This Week
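    A minimal sketch of the column-parallel linear layer at the heart of tensor parallelism. Real Megatron shards across GPUs with torch.distributed, which is omitted here, so this only demonstrates the arithmetic in a single process:

    ```python
    # Column parallelism: split a weight matrix along the output dimension;
    # each "worker" computes its slice, and concatenating the slices (an
    # all-gather in the distributed setting) recovers the full result exactly.
    import torch

    torch.manual_seed(0)
    d_in, d_out, world_size = 8, 12, 4
    x = torch.randn(2, d_in)                 # a batch of activations
    W = torch.randn(d_in, d_out)             # the full layer weight

    shards = W.chunk(world_size, dim=1)      # each worker holds d_in x 3
    partial = [x @ w for w in shards]        # per-"GPU" partial outputs
    y_parallel = torch.cat(partial, dim=1)   # gather along the feature axis

    print(torch.allclose(y_parallel, x @ W, atol=1e-6))  # True
    ```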
  • 3
    Deep Learning Is Nothing

    Deep learning concepts in an approachable style

    Deep-Learning-Is-Nothing presents deep learning concepts in an approachable, from-scratch style that demystifies the stack behind modern models. It typically begins with refreshers on linear algebra, calculus, and optimization before moving to perceptrons, multilayer networks, and gradient-based training. Implementations favor small, readable examples, often NumPy first, to show how forward and backward passes work without depending solely on high-level frameworks. Once the fundamentals are... A tiny NumPy example in the same spirit follows below.
    Downloads: 1 This Week
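    In that NumPy-first spirit, a minimal hand-written forward and backward pass: a two-layer network learns XOR with explicit chain-rule gradients. This is a generic illustration, not code taken from the repository:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

    W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
    W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
    lr = 0.5

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(5000):
        # Forward pass.
        h = np.tanh(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backward pass: for sigmoid + cross-entropy the gradient at the
        # logits is simply p - y; then apply the chain rule layer by layer.
        dlogits = (p - y) / len(X)
        dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
        dh = (dlogits @ W2.T) * (1 - h ** 2)          # tanh derivative
        dW1 = X.T @ dh; db1 = dh.sum(0)
        # Plain gradient-descent update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(p.round(2).ravel())   # approaches [0. 1. 1. 0.]
    ```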
  • 4
    vJEPA-2

    PyTorch code and models for VJEPA2 self-supervised learning from video

    VJEPA2 is a next-generation self-supervised learning framework for video that extends the “predict in representation space” idea of I-JEPA to the temporal domain. Instead of reconstructing pixels, it predicts the missing high-level embeddings of masked space-time regions using a context encoder and a slowly updated target encoder. This objective encourages the model to learn semantics, motion, and long-range structure without the shortcuts that pixel-level losses can invite. A toy sketch of this objective appears below.
    Downloads: 0 This Week
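    A toy sketch of that objective with stand-in MLP encoders. The masking scheme, architectures, and momentum value (0.99) are placeholder assumptions; VJEPA2 itself uses vision transformers over space-time patches:

    ```python
    import copy
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    dim = 32
    context_enc = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
    predictor   = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
    target_enc  = copy.deepcopy(context_enc)       # slowly updated copy
    for p in target_enc.parameters():
        p.requires_grad_(False)

    tokens = torch.randn(4, 16, dim)               # batch of patch embeddings
    mask = torch.rand(4, 16) < 0.5                 # masked space-time positions

    ctx = context_enc(tokens * (~mask).unsqueeze(-1).float())  # visible tokens only
    pred = predictor(ctx)                          # predict representations
    with torch.no_grad():
        tgt = target_enc(tokens)                   # targets live in latent space

    loss = F.mse_loss(pred[mask], tgt[mask])       # no pixel reconstruction
    loss.backward()                                # grads only on context path

    with torch.no_grad():                          # EMA update of target encoder
        for pt, pc in zip(target_enc.parameters(), context_enc.parameters()):
            pt.mul_(0.99).add_(pc, alpha=0.01)
    print(float(loss))
    ```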
  • 5
    GPT-NeoX

    Implementation of model parallel autoregressive transformers on GPUs

    This repository records EleutherAI's library for training large-scale language models on GPUs. Our current framework is based on NVIDIA's Megatron Language Model and has been augmented with techniques from DeepSpeed as well as some novel optimizations. We aim to make this repo a centralized and accessible place to gather techniques for training large-scale autoregressive language models and to accelerate research into large-scale training. For those looking for a TPU-centric codebase, we recommend Mesh Transformer JAX. A toy illustration of autoregressive decoding appears below.
    Downloads: 2 This Week
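    For readers new to the term, autoregressive decoding means generating one token at a time from the model's next-token distribution. A toy greedy loop with a stand-in scorer in place of a real transformer, unrelated to GPT-NeoX's actual API:

    ```python
    import torch

    vocab, dim = 100, 16
    torch.manual_seed(0)
    emb = torch.randn(vocab, dim)                # toy token embeddings
    proj = torch.randn(dim, vocab)               # toy output head

    def next_token_logits(seq):
        # Stand-in for a causal transformer: score from the mean embedding.
        return emb[seq].mean(dim=0) @ proj

    seq = [1]                                    # arbitrary start token id
    for _ in range(10):
        logits = next_token_logits(torch.tensor(seq))
        seq.append(int(logits.argmax()))         # greedy next-token choice
    print(seq)
    ```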
  • 6
    Robust Tube MPC

    Example implementation of robust tube-based model predictive control

    robust-tube-mpc is a MATLAB implementation of robust tube-based Model Predictive Control (MPC). The framework provides tools to design and simulate controllers that maintain stability and constraint satisfaction in the presence of bounded disturbances. Tube-based MPC achieves robustness by combining a nominal trajectory planner with an error-feedback controller that keeps the actual system state within a "tube" around the nominal trajectory. A minimal sketch of that structure appears below.
    Downloads: 4 This Week
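    A minimal NumPy sketch of that structure (the repository itself is MATLAB): a nominal trajectory evolves undisturbed while an error-feedback term pulls the true, disturbed state back toward it. The constrained optimization that makes this MPC is elided; the nominal input and the gain K come from unconstrained LQR, an assumption made for brevity:

    ```python
    # Tube MPC skeleton: u = v + K_fb (x - z), with z the nominal state,
    # v the nominal input, and K_fb a stabilizing error-feedback gain.
    import numpy as np

    A = np.array([[1.0, 1.0], [0.0, 1.0]])   # double integrator
    B = np.array([[0.5], [1.0]])
    Q, R = np.eye(2), np.eye(1)

    # Discrete-time LQR gain via Riccati iteration (u = -K x convention).
    P = Q.copy()
    for _ in range(500):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)

    rng = np.random.default_rng(0)
    x = np.array([5.0, 0.0])                 # true (disturbed) state
    z = x.copy()                             # nominal state
    for t in range(30):
        v = -(K @ z)                         # nominal input (stand-in for MPC)
        u = v - K @ (x - z)                  # feedback (K_fb = -K under the
                                             # LQR sign convention) keeps x
                                             # inside a tube around z
        w = rng.uniform(-0.1, 0.1, size=2)   # bounded disturbance
        x = A @ x + B @ u + w                # disturbed plant
        z = A @ z + B @ v                    # disturbance-free nominal model

    print("tracking error |x - z|:", np.linalg.norm(x - z))  # stays small
    ```

    Because the error e = x - z evolves as e' = (A - BK) e + w with A - BK stable and w bounded, the tracking error stays in a bounded set, which is exactly the "tube" the description refers to.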