Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. Because a derivative function is itself ordinary Python code, we can continue to differentiate as many times as we like, and use NumPy's vectorization to evaluate scalar-valued derivatives across many different input values. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
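
As a quick taste, here is a minimal sketch along the lines of the tutorial's tanh example: grad wraps an ordinary Python function and returns its derivative, and elementwise_grad can be stacked to take derivatives of derivatives across a whole array of inputs.

    import autograd.numpy as np                     # thinly-wrapped NumPy
    from autograd import grad                       # reverse-mode differentiation
    from autograd import elementwise_grad as egrad  # maps a grad over an array

    def tanh(x):
        # Ordinary Python/NumPy code; Autograd traces it as it runs.
        y = np.exp(-2.0 * x)
        return (1.0 - y) / (1.0 + y)

    grad_tanh = grad(tanh)   # grad returns a function computing d(tanh)/dx
    print(grad_tanh(1.0))    # ~0.4199743416140261

    # Derivatives of derivatives of derivatives, vectorized over many inputs:
    x = np.linspace(-7, 7, 200)
    d1 = egrad(tanh)(x)
    d2 = egrad(egrad(tanh))(x)
    d3 = egrad(egrad(egrad(tanh)))(x)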

Features

  • Simple neural net (see the sketch after this list)
  • Convolutional neural net
  • Recurrent neural net
  • LSTM
  • Neural Turing Machine
  • Backpropagating through a fluid simulation
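
For a flavor of the first item, here is a minimal sketch of the simple-neural-net pattern; it is not the bundled example verbatim, and the layer sizes, toy data, and learning rate are illustrative assumptions. grad differentiates the loss with respect to its first argument, even when that argument is a nested list of (weights, biases) pairs.

    import autograd.numpy as np
    import autograd.numpy.random as npr
    from autograd import grad

    def init_params(rs):
        # Hypothetical 1-16-1 network; layer sizes are illustrative.
        return [(0.1 * rs.randn(1, 16), np.zeros(16)),
                (0.1 * rs.randn(16, 1), np.zeros(1))]

    def predict(params, inputs):
        for W, b in params[:-1]:
            inputs = np.tanh(np.dot(inputs, W) + b)
        W, b = params[-1]
        return np.dot(inputs, W) + b

    def loss(params, inputs, targets):
        return np.mean((predict(params, inputs) - targets) ** 2)

    loss_grad = grad(loss)   # gradient w.r.t. params, same nested structure

    # Toy regression problem and plain gradient descent (step size assumed).
    x = np.linspace(-1, 1, 64).reshape(-1, 1)
    y = np.sin(3.0 * x)
    params = init_params(npr.RandomState(0))
    for _ in range(500):
        g = loss_grad(params, x, y)
        params = [(W - 0.1 * dW, b - 0.1 * db)
                  for (W, b), (dW, db) in zip(params, g)]

The bundled examples follow this same shape, with larger architectures and more careful optimizers.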

License

MIT License

Additional Project Details

Programming Language

Python

Related Categories

Python Source Code Analysis Tool

Registered

2021-10-12