
i-merge / Blog: Recent posts

Pipeline execution

Whether we like it or not, we must prepare to reuse an already-allocated network, because in the context of an ensemble method we cannot afford to allocate a separate copy of the weights for every network.

What is the best way to do so?

Remembering our old computer architecture courses at school, we may notice that we have all the ingredients for a "RISC"-style pipelined execution. In a RISC architecture, the instruction set is, and must be, simple. For instructions taking multiple cycles, if there is enough similarity between the instructions, we can begin the first cycle of an instruction, then execute its second cycle on the second physical layer while the next instruction begins its first cycle on the first physical layer. The first instruction then executes its third cycle on the third physical layer, the second instruction its second cycle on the second physical layer, and a new third instruction begins its first cycle on the first physical layer, and so on. Hence multiple instructions are computed at the same time, much like multiple cars being assembled at the same time on an assembly line. The overall throughput is increased.
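The schedule described above can be sketched in a few lines of plain Python (the function name and shape are illustrative, not part of the actual implementation): each "instruction" is one network evaluation, each pipeline stage is one physical layer, and on every cycle each in-flight evaluation advances one stage while a new one enters the first stage.

```python
def pipeline_schedule(n_instructions, n_stages):
    """Return, per cycle, which instruction occupies each stage.

    schedule[cycle][stage] is the index of the instruction being
    processed at that stage on that cycle, or None if the stage is idle.
    """
    # The pipeline drains n_stages - 1 cycles after the last entry.
    total_cycles = n_instructions + n_stages - 1
    schedule = []
    for cycle in range(total_cycles):
        row = []
        for stage in range(n_stages):
            # Instruction k enters stage s on cycle k + s.
            instr = cycle - stage
            row.append(instr if 0 <= instr < n_instructions else None)
        schedule.append(row)
    return schedule

# 3 stages (physical layers), 4 instructions (network evaluations):
for row in pipeline_schedule(4, 3):
    print(row)
```

Once the pipeline is full, every stage is busy on every cycle, so throughput approaches one completed evaluation per cycle instead of one per `n_stages` cycles.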

Posted by Francis Girard 2012-08-10

OpenCL for BPNN

Lots of things may be parallelized when "executing" a bpnn:

  1. Each multiplication and addition of the dot product computing the input signal of a given neuron: there will be Nb_neurons_on_previous_layer multiplications and as many additions.
  2. The activation function (a sigmoid computation, for example). There will be as many such computations as there are neurons (inputs and biases excluded, of course).
  3. The copy of the weights array (see previous post) from the host.
  4. The computation of the activations of different neurons on the same layer.
  5. The computation of different instances of networks (same network structure but different weights).
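As a sequential reference for items 1 and 2 above, here is a minimal sketch (the function name is hypothetical) of the per-neuron work being parallelized: the dot product over the previous layer's activations plus the bias, followed by the sigmoid. In the OpenCL version, each multiplication, and each neuron, could be mapped to its own work-item; this plain-Python version only shows what each work-item would compute.

```python
import math

def neuron_output(prev_activations, weights):
    """Activation of one neuron on layer i.

    weights[0] is the bias weight (the bias input is implicitly 1.0);
    weights[1:] label the incoming connections from layer i-1.
    """
    signal = weights[0]  # start from the bias contribution
    # Nb_neurons_on_previous_layer multiplications and additions:
    for a, w in zip(prev_activations, weights[1:]):
        signal += a * w
    return 1.0 / (1.0 + math.exp(-signal))  # sigmoid activation
```

Items 4 and 5 then amount to running this computation independently for every neuron of a layer, and for every network instance, in parallel.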
Posted by Francis Girard 2012-08-07

Weights layout

For connections incoming into a given neuron

Weights are best viewed as labels on the incoming edges of a given neuron. If a neuron is on layer i, then the weights of the incoming connections from layer i-1 are given by the following array:

weight-from-Bias, weight-from-neuron-1, weight-from-neuron-2, ..., weight-from-neuron-N

where N is the number of actual neurons on layer i-1.
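A small sketch of this layout (names and the flat-array packing are illustrative assumptions): storing, per neuron, the bias weight followed by one weight per neuron of layer i-1 makes the incoming weights of any neuron a contiguous slice of the flat weights array.

```python
def incoming_weights(flat_weights, neuron_index, n_prev):
    """Slice the weights of the connections coming into one neuron.

    flat_weights stores (1 + n_prev) entries per neuron of the layer:
    the bias weight first, then the weights from the n_prev neurons
    of the previous layer.
    """
    stride = 1 + n_prev
    start = neuron_index * stride
    return flat_weights[start:start + stride]

# Layer with 2 neurons, previous layer with 3 neurons (N = 3):
flat = [0.1, 0.2, 0.3, 0.4,   # neuron 0: bias, w1, w2, w3
        0.5, 0.6, 0.7, 0.8]   # neuron 1: bias, w1, w2, w3
print(incoming_weights(flat, 1, 3))  # → [0.5, 0.6, 0.7, 0.8]
```

Contiguity matters here: it lets the whole array be copied to the device in one transfer, and each neuron's dot product reads its weights sequentially.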

Posted by Francis Girard 2012-08-07