Showing 44 open source projects for "neural network code in matlab"

  • 1
    Neural Network Intelligence

    AutoML toolkit for automating the machine learning lifecycle

    Neural Network Intelligence (NNI) is a lightweight but powerful open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, hyperparameter tuning, and model compression. The tool manages automated machine learning (AutoML) experiments, dispatches and runs experiments...
    Downloads: 1 This Week
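
    The trial-script side of an NNI hyperparameter search boils down to two calls, sketched below; the two-parameter search space and the stand-in `train_and_evaluate` function are assumed here for illustration, and a real experiment is launched separately from an experiment config.

```python
"""trial.py -- a minimal sketch of an NNI trial script (search space assumed)."""
import nni

def train_and_evaluate(lr, hidden_size):
    # Stand-in for a real training loop; returns a dummy score for illustration.
    return 1.0 / (1.0 + abs(lr - 0.01) + abs(hidden_size - 128) / 1000.0)

if __name__ == "__main__":
    # Ask the NNI tuner for the next hyperparameter combination to try.
    params = nni.get_next_parameter()   # e.g. {"lr": 0.01, "hidden_size": 128}
    score = train_and_evaluate(params.get("lr", 0.01), params.get("hidden_size", 128))
    # Report the final metric so the tuner can steer the search.
    nni.report_final_result(score)
```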
  • 2
    Bumblebee

    Pre-trained Neural Network models in Axon

    Bumblebee provides pre-trained Neural Network models on top of Axon. It includes integration with Hugging Face Models, allowing anyone to download and perform Machine Learning tasks with a few lines of code. The best way to get started with Bumblebee is with Livebook. Our announcement video shows how to use Livebook's Smart Cells to perform different Neural Network tasks with a few clicks.
    Downloads: 0 This Week
  • 3
    Sonnet

    TensorFlow-based neural network library

    Sonnet is a neural network library built on top of TensorFlow designed to provide simple, composable abstractions for machine learning research. Sonnet can be used to build neural networks for various purposes, including different types of learning. Sonnet’s programming model revolves around a single concept: modules. These modules can hold references to parameters, other modules and methods that apply some function on the user input.
    Downloads: 0 This Week
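
    A minimal sketch of the module concept described above, assuming Sonnet 2 on TensorFlow 2; the layer sizes are illustrative only.

```python
import sonnet as snt
import tensorflow as tf

class MLP(snt.Module):
    """A module holds parameters and other modules, and applies a function to its input."""
    def __init__(self, hidden_size=64, output_size=10, name=None):
        super().__init__(name=name)
        self.hidden = snt.Linear(hidden_size)   # sub-modules are ordinary attributes
        self.out = snt.Linear(output_size)

    def __call__(self, x):
        return self.out(tf.nn.relu(self.hidden(x)))

mlp = MLP()
logits = mlp(tf.random.normal([8, 32]))         # variables are created on first call
print(len(mlp.trainable_variables))             # 4: two weight matrices, two biases
```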
  • 4
    CoreNet

    CoreNet: A library for training deep neural networks

    CoreNet is Apple’s deep learning library for distributed neural network training, designed for high scalability, low-latency communication, and strong hardware efficiency. It focuses on enabling large-scale model training across clusters of GPUs and accelerators by optimizing data flow and parallelism strategies. CoreNet provides abstractions for data, tensor, and pipeline parallelism, allowing models to scale without code duplication or heavy manual configuration. ...
    Downloads: 0 This Week
  • 5
    Gen.jl

    A general-purpose probabilistic programming system

    ...Users can also hand-code parts of their models that demand better performance. Neural network inference is fast, but can be inaccurate on out-of-distribution data, and requires expensive training.
    Downloads: 0 This Week
  • 6
    TensorFlow.js

    TensorFlow.js is a library for machine learning in JavaScript

    TensorFlow.js is a library for machine learning in JavaScript. Develop ML models in JavaScript, and use ML directly in the browser or in Node.js. Use off-the-shelf JavaScript models or convert Python TensorFlow models to run in the browser or under Node.js. Retrain pre-existing ML models using your own data. Build and train models directly in JavaScript using flexible and intuitive APIs. Tensors are the core data structure of TensorFlow.js. They are a generalization of vectors and matrices to...
    Downloads: 2 This Week
  • 7
    Opacus

    Training PyTorch models with differential privacy

    Opacus is a library that enables training PyTorch models with differential privacy. It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to track the privacy budget expended at any given moment. It provides vectorized per-sample gradient computation that is 10x faster than microbatching, supports most types of PyTorch models, and can be used with minimal modification to the original neural network. ...
    Downloads: 0 This Week
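
    A rough sketch of the intended workflow, assuming a toy model and dataset: `PrivacyEngine.make_private` wraps the model, optimizer, and data loader, after which the training loop is unchanged.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy model and data, assumed purely for illustration.
model = nn.Linear(20, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data = TensorDataset(torch.randn(256, 20), torch.randint(0, 2, (256,)))
loader = DataLoader(data, batch_size=32)

privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.0,   # noise added to clipped per-sample gradients
    max_grad_norm=1.0,      # per-sample gradient clipping threshold
)

criterion = nn.CrossEntropyLoss()
for x, y in loader:          # ordinary training loop, unchanged
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()

print(privacy_engine.get_epsilon(delta=1e-5))   # privacy budget spent so far
```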
  • 8
    MIVisionX

    Set of comprehensive computer vision & machine intelligence libraries

    MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit. AMD MIVisionX delivers highly optimized open-source implementation of the Khronos OpenVX™ and OpenVX™ Extensions along with Convolution Neural Net Model Compiler & Optimizer supporting ONNX, and Khronos NNEF™ exchange formats. The toolkit allows for rapid prototyping and deployment of optimized computer vision and machine learning...
    Downloads: 0 This Week
  • 9
    PyG

    Graph Neural Network Library for PyTorch

    PyG (PyTorch Geometric) is a library built upon PyTorch to easily write and train Graph Neural Networks (GNNs) for a wide range of applications related to structured data. It consists of various methods for deep learning on graphs and other irregular structures, also known as geometric deep learning, from a variety of published papers. In addition, it provides easy-to-use mini-batch loaders for operating on many small graphs or single giant graphs, multi-GPU support, DataPipe support,...
    Downloads: 0 This Week
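
    A small sketch of the workflow: a toy three-node graph held in PyG's `Data` container and a two-layer GCN (sizes chosen only for illustration).

```python
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# A toy undirected graph with 3 nodes and 16 features per node.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)   # COO edge list
data = Data(x=torch.randn(3, 16), edge_index=edge_index)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(16, 32)
        self.conv2 = GCNConv(32, 2)

    def forward(self, data):
        h = self.conv1(data.x, data.edge_index).relu()
        return self.conv2(h, data.edge_index)        # per-node logits

out = GCN()(data)     # shape [3, 2]
```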
  • 10
    'Lightweight' GAN

    Implementation of 'lightweight' GAN, proposed in ICLR 2021

    Implementation of 'lightweight' GAN proposed in ICLR 2021, in Pytorch. The main contribution of the paper is a skip-layer excitation in the generator, paired with autoencoding self-supervised learning in the discriminator. Quoting the one-line summary "converge on single gpu with few hours' training, on 1024 resolution sub-hundred images". Augmentation is essential for Lightweight GAN to work effectively in a low data setting. You can test and see how your images will be augmented before...
    Downloads: 0 This Week
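
    The skip-layer excitation idea can be sketched roughly as a squeeze-and-gate between resolutions; the channel counts and layer choices below are illustrative, not the repository's exact implementation.

```python
import torch
from torch import nn

class SkipLayerExcitation(nn.Module):
    """Gate a high-resolution feature map with per-channel weights
    computed from a low-resolution feature map."""
    def __init__(self, low_ch, high_ch):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(4),            # squeeze the low-res map to 4x4
            nn.Conv2d(low_ch, high_ch, 4),      # 4x4 conv -> 1x1 spatial map
            nn.SiLU(),
            nn.Conv2d(high_ch, high_ch, 1),
            nn.Sigmoid(),                       # per-channel gates in [0, 1]
        )

    def forward(self, high_res, low_res):
        return high_res * self.gate(low_res)    # broadcast over spatial dims

sle = SkipLayerExcitation(low_ch=256, high_ch=64)
out = sle(torch.randn(1, 64, 128, 128), torch.randn(1, 256, 8, 8))
```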
  • 11
    FairChem

    FAIR Chemistry's library of machine learning methods for chemistry

    FAIRChem is a unified library for machine learning in chemistry and materials, consolidating data, pretrained models, demos, and application code into a single, versioned toolkit. Version 2 modernizes the stack with a cleaner core package and breaking changes relative to V1, focusing on simpler installs and a stable API surface for production and research. The centerpiece models (e.g., UMA variants) plug directly into the ASE ecosystem via a FAIRChem calculator, so users can run relaxations,...
    Downloads: 0 This Week
  • 12
    Neural Network Visualization

    Project for processing neural networks and rendering to gain insights

    nn_vis is a minimalist visualization tool for neural networks written in Python using OpenGL and Pygame. It provides an interactive, graphical representation of how data flows through neural network layers, offering a unique educational experience for those new to deep learning or looking to explain it visually. By animating input, weights, activations, and outputs, the tool demystifies neural network operations and helps users intuitively grasp complex concepts. Its lightweight codebase is...
    Downloads: 2 This Week
  • 13
    Alpa

    Training and serving large-scale neural networks

    Alpa is a system for training and serving large-scale neural networks. Scaling neural networks to hundreds of billions of parameters has enabled dramatic breakthroughs such as GPT-3, but training and serving these large-scale neural networks require complicated distributed system techniques. Alpa aims to automate large-scale distributed training and serving with just a few lines of code.
    Downloads: 0 This Week
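
    In spirit, those few lines look like the sketch below: `alpa.parallelize` is applied to a JAX training step much like `jax.jit`, and Alpa decides how to shard the computation across devices. The toy linear model and data are assumptions, and a real multi-node setup may also require cluster initialization.

```python
import alpa
import jax
import jax.numpy as jnp

def loss_fn(w, x, y):
    return jnp.mean((x @ w - y) ** 2)     # toy linear model

@alpa.parallelize                          # drop-in for jax.jit, but auto-parallelized
def train_step(w, x, y):
    grads = jax.grad(loss_fn)(w, x, y)
    return w - 0.1 * grads

w = jnp.zeros((128, 10))
w = train_step(w, jnp.ones((32, 128)), jnp.zeros((32, 10)))
```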
  • 14
    AlphaZero.jl

    A generic, simple and fast implementation of DeepMind's AlphaZero

    Beyond its much-publicized success in attaining a superhuman level at games such as Chess and Go, DeepMind's AlphaZero algorithm illustrates a more general methodology of combining learning and search to explore large combinatorial spaces effectively. We believe that this methodology can have exciting applications in many different research areas. Because AlphaZero is resource-hungry, successful open-source implementations (such as Leela Zero) are written in low-level languages (such as C++)...
    Downloads: 18 This Week
  • 15
    NeuMan

    Neural Human Radiance Field from a Single Video (ECCV 2022)

    NeuMan is a reference implementation that reconstructs both an animatable human and its background scene from a single monocular video using neural radiance fields. It supports novel view and novel pose synthesis, enabling compositional results like transferring reconstructed humans into new scenes. The pipeline separates human/body and environment, learning consistent geometry and appearance to support animation. Demos showcase sequences such as dance and handshake, and the code provides...
    Downloads: 0 This Week
  • 16
    NanoNeuron

    NanoNeuron is 7 simple JavaScript functions

    NanoNeuron is a didactic project that reduces the idea of a neuron to a handful of tiny JavaScript functions so learners can see “learning” in action without heavy frameworks. It demonstrates how a scalar input can be linearly transformed with a weight and bias, then adjusted via gradient updates to fit a simple mapping such as Celsius-to-Fahrenheit conversion. The code emphasizes readability over performance, inviting you to step through calculations and watch parameters converge. Because...
    Downloads: 0 This Week
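
    The same idea in a few lines of Python (the project itself is JavaScript): a single weight and bias fitted to the Celsius-to-Fahrenheit mapping by gradient descent on the squared error.

```python
# Fit y = w * x + b to Fahrenheit = Celsius * 1.8 + 32 with plain gradient descent.
xs = list(range(-20, 21))
ys = [x * 1.8 + 32 for x in xs]

w, b, lr = 0.0, 0.0, 0.005
for _ in range(5000):
    dw = db = 0.0
    for x, y in zip(xs, ys):
        err = (w * x + b) - y       # derivative of 0.5 * err**2 w.r.t. the prediction
        dw += err * x
        db += err
    w -= lr * dw / len(xs)
    b -= lr * db / len(xs)

print(round(w, 3), round(b, 3))     # approaches 1.8 and 32
```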
  • 17
    Fairseq

    Facebook AI Research Sequence-to-Sequence Toolkit written in Python

    Fairseq(-py) is a sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling and other text generation tasks. We provide reference implementations of various sequence modeling papers. Recent work by Microsoft and Google has shown that data parallel training can be made significantly more efficient by sharding the model parameters and optimizer state across data parallel workers. These ideas are encapsulated in the...
    Downloads: 0 This Week
  • 18
    TensorNetwork

    A library for easy and efficient manipulation of tensor networks

    TensorNetwork is a high-level library for building and contracting tensor networks—graphical factorizations of large tensors that underpin many algorithms in physics and machine learning. It abstracts networks as nodes and edges, then compiles efficient contraction orders across multiple numeric backends so users can focus on model structure rather than index bookkeeping. Common network families (MPS/TT, PEPS, MERA, tree networks) are expressed with concise APIs that encourage...
    Downloads: 0 This Week
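
    A minimal sketch of the node-and-edge abstraction: two tensors share one edge, and contracting that edge reproduces a matrix product (shapes chosen arbitrarily).

```python
import numpy as np
import tensornetwork as tn

a = tn.Node(np.random.rand(8, 4), name="a")
b = tn.Node(np.random.rand(4, 6), name="b")

edge = a[1] ^ b[0]            # connect a's second index to b's first index
result = tn.contract(edge)    # contract the shared edge

print(result.tensor.shape)    # (8, 6), the same as a plain matrix product
```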
  • 19
    Nerfies

    This is the code for Deformable Neural Radiance Fields

    Nerfies demonstrates deformation-aware neural radiance fields that reconstruct and render dynamic, real-world scenes from casual video. Instead of assuming a static world, the method learns a canonical space plus a deformation field that maps changing poses or expressions back to that space during training. This lets the system generate photorealistic novel views of nonrigid subjects—faces, bodies, cloth—while preserving fine detail and consistent lighting. The training pipeline handles...
    Downloads: 0 This Week
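
    A conceptual sketch of the canonical-plus-deformation split (not the Nerfies code itself, and with dimensions chosen only for illustration): a deformation MLP, conditioned on a per-frame latent code, warps each observed point into the canonical space before the radiance MLP is queried.

```python
import torch
from torch import nn

class DeformableRadianceField(nn.Module):
    def __init__(self, latent_dim=8, hidden=64):
        super().__init__()
        self.deform = nn.Sequential(nn.Linear(3 + latent_dim, hidden), nn.ReLU(),
                                    nn.Linear(hidden, 3))       # offset into canonical space
        self.canonical = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                       nn.Linear(hidden, 4))    # RGB + density

    def forward(self, x, frame_latent):
        offset = self.deform(torch.cat([x, frame_latent], dim=-1))
        return self.canonical(x + offset)

field = DeformableRadianceField()
out = field(torch.randn(1024, 3), torch.randn(1024, 8))          # per-point RGB + density
```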
  • 20
    Lucid

    A collection of infrastructure and tools for research

    Lucid is a collection of infrastructure and tools for research in neural network interpretability. Lucid is research code, not production code. We provide no guarantee it will work for your use case. Lucid is maintained by volunteers who are unable to provide significant technical support. Start visualizing neural networks with no setup. The following notebooks run right from your browser, thanks to Colaboratory.
    Downloads: 0 This Week
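
    The no-setup workflow reduces to a few calls, roughly as they appear in the project's quickstart (Lucid targets the TensorFlow 1.x era; the layer and channel below are just an example).

```python
import lucid.modelzoo.vision_models as models
from lucid.optvis import render

model = models.InceptionV1()
model.load_graphdef()

# Optimize an input image so that one channel of a mixed4a unit fires strongly.
images = render.render_vis(model, "mixed4a_pre_relu:476")
```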
  • 21
    fastNLP

    fastNLP: A Modularized and Extensible NLP Framework

    fastNLP is a lightweight framework for natural language processing (NLP); its goal is to make it quick to implement NLP tasks and build complex models. A unified tabular data container simplifies data preprocessing, and built-in Loader and Pipe classes for multiple datasets eliminate the need for preprocessing code. It offers various convenient NLP tools, such as Embedding loading (including ELMo and BERT) and intermediate data caching, and provides a variety of neural network components and reimplemented models (covering tasks such as Chinese word segmentation, named entity recognition, syntactic analysis, text classification, text matching, coreference resolution, summarization, etc.). ...
    Downloads: 0 This Week
  • 22
    Machine Learning Octave

    MATLAB/Octave examples of popular machine learning algorithms

    This repository contains MATLAB/Octave implementations of popular machine learning algorithms, along with explanatory code and mathematical derivations, intended as educational material rather than production code. It includes implementations of supervised learning algorithms (linear regression, logistic regression, neural networks). The author’s goal is to help users understand how each algorithm works “from scratch,” avoiding black-box library calls.
    Downloads: 5 This Week
  • 23
    The Neural Process Family

    This repository contains notebook implementations

    Neural Processes (NPs) is a collection of interactive Jupyter/Colab notebook implementations developed by Google DeepMind, showcasing three foundational probabilistic machine learning models: Conditional Neural Processes (CNPs), Neural Processes (NPs), and Attentive Neural Processes (ANPs). These models combine the strengths of neural networks and stochastic processes, allowing for flexible function approximation with uncertainty estimation. They can learn distributions over functions from...
    Downloads: 4 This Week
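
    A rough sketch of the Conditional Neural Process idea in PyTorch, independent of the notebooks' own code and with illustrative sizes: context (x, y) pairs are encoded and averaged into a single representation, which a decoder combines with each target x to predict a mean and an uncertainty.

```python
import torch
from torch import nn
import torch.nn.functional as F

class ConditionalNeuralProcess(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim + y_dim, 64), nn.ReLU(),
                                     nn.Linear(64, r_dim))
        self.decoder = nn.Sequential(nn.Linear(r_dim + x_dim, 64), nn.ReLU(),
                                     nn.Linear(64, 2 * y_dim))

    def forward(self, x_ctx, y_ctx, x_tgt):
        r = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=0)   # aggregate context
        r = r.expand(x_tgt.shape[0], -1)
        mean, raw_std = self.decoder(torch.cat([r, x_tgt], dim=-1)).chunk(2, dim=-1)
        return mean, F.softplus(raw_std)          # predictive mean and std per target

cnp = ConditionalNeuralProcess()
mean, std = cnp(torch.randn(10, 1), torch.randn(10, 1),
                torch.linspace(-1, 1, 50).unsqueeze(-1))
```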
  • 24
    ResNeXt

    Implementation of a classification framework

    ResNeXt is a deep neural network architecture for image classification built on the idea of aggregated residual transformations. Instead of simply increasing depth or width, ResNeXt introduces a new dimension called cardinality, which refers to the number of parallel transformation paths (i.e. the number of “branches”) that are aggregated together. Each branch is a small transformation (e.g. bottleneck block) and their outputs are summed—this enables richer representation without excessive parameter blowup. ...
    Downloads: 0 This Week
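
    The aggregated-transformations idea maps directly onto a grouped convolution, roughly as sketched below; the channel sizes follow the common 32x4d configuration but are otherwise illustrative.

```python
import torch
from torch import nn

class ResNeXtBlock(nn.Module):
    """Bottleneck block: the grouped 3x3 convolution realizes `cardinality`
    parallel branches, and the following 1x1 convolution aggregates them."""
    def __init__(self, channels=256, bottleneck=128, cardinality=32):
        super().__init__()
        self.transform = nn.Sequential(
            nn.Conv2d(channels, bottleneck, 1, bias=False),
            nn.BatchNorm2d(bottleneck), nn.ReLU(inplace=True),
            nn.Conv2d(bottleneck, bottleneck, 3, padding=1,
                      groups=cardinality, bias=False),        # 32 parallel paths
            nn.BatchNorm2d(bottleneck), nn.ReLU(inplace=True),
            nn.Conv2d(bottleneck, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(x + self.transform(x))               # residual connection

y = ResNeXtBlock()(torch.randn(1, 256, 56, 56))
```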
  • 25
    CrypTen

    A framework for Privacy Preserving Machine Learning

    CrypTen is a research framework developed by Facebook Research for privacy-preserving machine learning built directly on top of PyTorch. It provides a secure and intuitive environment for performing computations on encrypted data using Secure Multiparty Computation (SMPC). Designed to make secure computation accessible to machine learning practitioners, CrypTen introduces a CrypTensor object that behaves like a regular PyTorch tensor, allowing users to seamlessly apply automatic...
    Downloads: 0 This Week
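
    A tiny sketch of the CrypTensor behavior described above, run as a single-process demo (real deployments distribute the parties).

```python
import torch
import crypten

crypten.init()

# Encrypt two tensors and compute on them as if they were ordinary tensors.
a = crypten.cryptensor(torch.tensor([1.0, 2.0, 3.0]))
b = crypten.cryptensor(torch.tensor([4.0, 5.0, 6.0]))

total = a + b              # arithmetic on secret-shared values
dot = (a * b).sum()

print(total.get_plain_text())   # tensor([5., 7., 9.]) after decryption
print(dot.get_plain_text())     # tensor(32.)
```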