... the layers of the network but also the shapes of the data passed between the layers. To perform backpropagation, one calls the eponymous function, which takes a network, appropriate input, and target data, and returns the backpropagated gradients for the network. The shape of each gradient matches its layer, and may be trivial for layers like Relu, which have no learnable parameters.
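The following is a minimal sketch of that calling convention in Python; the names `backpropagate`, `Dense`, and `Relu`, the squared-error loss, and the dictionary representation of gradients are all illustrative assumptions, not the library's actual API.

```python
# Hypothetical sketch: per-layer gradients returned from a single
# `backpropagate(network, input, target)` call, as described above.
import numpy as np

class Dense:
    """Fully connected layer with learnable weights and bias."""
    def __init__(self, n_in, n_out, rng):
        self.w = rng.standard_normal((n_out, n_in)) * 0.1
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                       # cache input for the backward pass
        return self.w @ x + self.b

    def backward(self, grad_out):
        # Gradient shapes mirror the parameter shapes: (n_out, n_in) and (n_out,)
        grad = {"w": np.outer(grad_out, self.x), "b": grad_out}
        return grad, self.w.T @ grad_out

class Relu:
    """Activation layer: no learnable parameters, so its gradient is trivial."""
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        return None, grad_out * self.mask  # None stands in for the empty gradient

def backpropagate(network, x, target):
    """Forward pass, then return one parameter gradient per layer
    for a squared-error loss against `target`."""
    for layer in network:
        x = layer.forward(x)
    grad = x - target                    # dL/dy for L = 0.5 * ||y - target||^2
    grads = []
    for layer in reversed(network):
        layer_grad, grad = layer.backward(grad)
        grads.append(layer_grad)
    return list(reversed(grads))         # gradients in network order

rng = np.random.default_rng(0)
net = [Dense(3, 4, rng), Relu(), Dense(4, 2, rng)]
grads = backpropagate(net, np.array([1.0, -2.0, 0.5]), np.array([0.0, 1.0]))
# grads[1] is None: the Relu layer contributes a trivial gradient.
```

Returning one gradient per layer, shaped like that layer's parameters, lets a generic update step pair each gradient with its layer without any shape bookkeeping at the call site.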