Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. You can keep differentiating as many times as you like, and use NumPy's vectorization to evaluate the derivative of a scalar-valued function across many input values at once. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
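
As a minimal sketch in the spirit of the tutorial: `grad` turns a function into its derivative function, nested calls to `grad` give higher-order derivatives, and `elementwise_grad` broadcasts a scalar function's derivative across an array of inputs. The `tanh` definition and printed values here are illustrative.

```python
import autograd.numpy as np          # thinly-wrapped NumPy
from autograd import grad, elementwise_grad

def tanh(x):
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)               # reverse-mode derivative function
print(grad_tanh(1.0))                # ~0.41997, i.e. 1 - tanh(1)**2

# grad returns an ordinary Python function, so derivatives of
# derivatives of derivatives are just nested calls:
print(grad(grad(grad(tanh)))(1.0))

# Vectorize the scalar derivative across many input values:
x = np.linspace(-7, 7, 200)
dtanh_dx = elementwise_grad(tanh)(x)
```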

Features

  • Simple neural net (see the sketch after this list)
  • Convolutional neural net
  • Recurrent neural net
  • LSTM
  • Neural Turing Machine
  • Backpropagating through a fluid simulation
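
As a taste of the first item, here is a minimal sketch of what a toy feedforward net trained with Autograd might look like. The 3-4-1 layer sizes, learning rate, and random data are invented for illustration; the real version lives in the examples directory.

```python
import autograd.numpy as np
import autograd.numpy.random as npr
from autograd import grad

def predict(params, inputs):
    # One hidden tanh layer; shapes are illustrative only.
    (W1, b1), (W2, b2) = params
    hidden = np.tanh(np.dot(inputs, W1) + b1)
    return np.dot(hidden, W2) + b2

def loss(params, inputs, targets):
    return np.mean((predict(params, inputs) - targets) ** 2)

loss_grad = grad(loss)  # gradient w.r.t. params (the first argument)

npr.seed(0)
params = [(0.1 * npr.randn(3, 4), np.zeros(4)),   # hypothetical 3-4-1 net
          (0.1 * npr.randn(4, 1), np.zeros(1))]
inputs, targets = npr.randn(10, 3), npr.randn(10, 1)

for _ in range(100):                              # plain gradient descent
    grads = loss_grad(params, inputs, targets)
    params = [(W - 0.1 * dW, b - 0.1 * db)
              for (W, b), (dW, db) in zip(params, grads)]
```

Because Autograd differentiates through nested lists and tuples of arrays, `grads` comes back with the same structure as `params`, so the update step is a plain structural walk.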

License

MIT License

Additional Project Details

Programming Language: Python
Related Categories: Python Source Code Analysis Tool
Registered: 2021-10-12