Autograd
Efficiently computes derivatives of numpy code
Autograd supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. Derivatives can themselves be differentiated as many times as we like, and numpy's vectorization lets us evaluate the derivative of a scalar-valued function across many different input values at once. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
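As a short sketch of that workflow (not taken from the tutorial; it assumes the standard autograd entry points grad, elementwise_grad, and make_jvp, with a hand-written tanh as the example function):

```python
import autograd.numpy as np                      # thinly wrapped numpy
from autograd import grad, elementwise_grad, make_jvp

def tanh(x):                                     # an ordinary numpy function
    return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

# Reverse mode: gradient of a scalar-valued function.
grad_tanh = grad(tanh)
print(grad_tanh(1.0))                            # ~0.419974, i.e. 1 - tanh(1)**2
print((tanh(1.0001) - tanh(0.9999)) / 0.0002)    # finite-difference check

# Derivatives compose, so we can keep differentiating.
print(grad(grad(tanh))(1.0))                     # second derivative at x = 1.0

# elementwise_grad broadcasts a scalar derivative over an array of inputs.
x = np.linspace(-7, 7, 200)
first = elementwise_grad(tanh)(x)
second = elementwise_grad(elementwise_grad(tanh))(x)

# Forward mode (assumed convention: make_jvp(f)(x)(v) -> (f(x), directional derivative)).
value, tangent = make_jvp(tanh)(1.0)(1.0)
```

Because elementwise_grad maps a scalar derivative over an array, the first and second derivatives above are evaluated at all 200 input points in a single vectorized call.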