
# Gradient Ascent Example

It's usually very hard to understand, neuron by neuron, how a neural network dedicated to image classification works internally. One technique that helps reveal what individual neurons represent is called Gradient Ascent.

In this technique, an arbitrary neuron is forced to activate, and then the same backpropagation method used for learning is applied to an input image, producing the image that this neuron expects to see.
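As a rough sketch of the idea, assuming a made-up single linear layer (this is plain NumPy for illustration, not the CAI API; the layer shape, neuron index, and learning rate are invented):

```python
import numpy as np

# Gradient ascent on the input: pick one neuron of a fixed linear layer and
# repeatedly nudge the input in the direction that raises that neuron's
# activation. For a linear layer, that direction is the neuron's weight row.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 16))   # hypothetical layer: 4 neurons, 16-pixel input
x = np.zeros(16)                   # start from a blank "image"
neuron = 2                         # the neuron we force to activate
lr = 0.1

for _ in range(100):
    grad = W[neuron]               # d(activation[neuron]) / d(x) for a linear layer
    x += lr * grad                 # ascend: move the input toward higher activation

# The input converges to a multiple of the neuron's weight vector, i.e. the
# pattern "this neuron expects to see".
print(np.allclose(x / np.linalg.norm(x), W[neuron] / np.linalg.norm(W[neuron])))
# prints: True
```

In a real convolutional network the gradient is obtained by backpropagation rather than read off a weight row, but the update loop is the same.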

To be able to run this example, you'll need to load an already trained neural network file and then select the layer you intend to visualize.

Deeper convolutional layers tend to produce more complex patterns. The image above was produced from the first convolutional layer; the following image was produced from the third convolutional layer. Notice that its patterns are more complex.

This is the API method used to backpropagate from an arbitrary neuron (Gradient Ascent):

```pascal
procedure TNNet.BackpropagateFromLayerAndNeuron(LayerIdx, NeuronIdx: integer; Error: TNeuralFloat);
```

Error collection at the input layer isn't enabled by default. In this example, it's enabled with:

```pascal
TNNetInput(FNN.Layers[0]).EnableErrorCollection();
```

Then, the collected error is applied to the input (scaled by -1), and the network's deltas and inertia are cleared:

```pascal
vInput.MulAdd(-1, FNN.Layers[0].OutputError);
FNN.ClearDeltas();
FNN.ClearInertia();
```
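Taken together, these steps implement one gradient-ascent iteration. A hypothetical NumPy analogue (not the CAI API; the two-layer net, neuron index, and learning rate are invented for illustration) might look like:

```python
import numpy as np

# Sketch of one Gradient Ascent loop: inject an error at a chosen neuron,
# backpropagate it to the input, then apply the input error to the image
# with a negative factor, playing the role of vInput.MulAdd(-1, ...).
rng = np.random.default_rng(1)
W1 = rng.standard_normal((8, 16))   # made-up 2-layer net over a 16-pixel input
W2 = rng.standard_normal((4, 8))
x = rng.standard_normal(16) * 0.01  # near-blank starting "image"
neuron = 1                          # neuron we force to activate
lr = 0.1                            # scaled-down stand-in for the -1 factor

def target_activation(x):
    return (W2 @ np.maximum(W1 @ x, 0.0))[neuron]

before = target_activation(x)
for _ in range(50):
    h = np.maximum(W1 @ x, 0.0)             # forward pass (ReLU hidden layer)
    delta2 = np.zeros(4)
    delta2[neuron] = -1.0                   # inject error at the chosen neuron
    delta1 = (W2.T @ delta2) * (h > 0)      # backpropagate through the ReLU
    input_error = W1.T @ delta1             # error collected at the input layer
    x += -lr * input_error                  # the MulAdd(-1, OutputError) step

assert target_activation(x) > before        # the neuron now activates more strongly
```

With this sign convention, subtracting the collected input error moves the image toward higher activation of the chosen neuron, which is why the library's update uses a factor of -1.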

You can find more about Gradient Ascent at:

* Lecture 12: Visualizing and Understanding - CS231n - Stanford
* Understanding Neural Networks Through Deep Visualization

Source: README.md, updated 2021-11-17