This project was derived from the following project on CCodeChamp by MR CODER:
http://www.ccodechamp.com/c-program-of-multilayer-perceptron-net-using-backpropagation/
I rewrote the project in C++ and made it more object-oriented. Then I added the capability to use dropout on the hidden layers, as specified by Geoffrey Hinton et al. in "Improving neural networks by preventing co-adaptation of feature detectors" (2012):
http://www.cs.toronto.edu/~hinton/absps/dropout.pdf
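As a rough illustration, the dropout mechanism can be sketched as below. This is a minimal standalone sketch, not this project's actual API: applyDropout is a hypothetical helper operating on a plain std::vector of hidden activations, and it uses the "inverted" scaling variant, which is equivalent in expectation to the paper's scheme of halving the outgoing weights at test time.

    #include <cstddef>
    #include <random>
    #include <vector>

    // Zero each hidden activation with probability dropProb during training.
    // Survivors are scaled by 1 / (1 - dropProb) so the expected activation
    // is unchanged and no rescaling is needed at test time.
    std::vector<double> applyDropout(const std::vector<double>& activations,
                                     double dropProb, std::mt19937& rng) {
        std::bernoulli_distribution keep(1.0 - dropProb);
        const double scale = 1.0 / (1.0 - dropProb);
        std::vector<double> out(activations.size());
        for (std::size_t i = 0; i < activations.size(); ++i) {
            out[i] = keep(rng) ? activations[i] * scale : 0.0;
        }
        return out;
    }

    int main() {
        std::mt19937 rng(42);
        std::vector<double> hidden = {0.3, -0.7, 0.9, 0.1};
        // p = 0.5 for hidden units, as in the paper.
        std::vector<double> dropped = applyDropout(hidden, 0.5, rng);
        // 'dropped' feeds the next layer on this training pass only.
    }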
I found that the network performed much worse with dropout. I suspect this is because my network is very small, whereas the paper applied dropout to large networks. I also found that the outputs approach -1 and 1 as the network is made larger. This may be due to the absence of a weight constraint such as the per-unit bound on the L2 norm of incoming weights described in the dropout paper.
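For comparison, the paper's weight constraint can be sketched as follows (constrainIncomingWeights is a hypothetical helper, not part of this project): after each update, a hidden unit's incoming weight vector is rescaled whenever its L2 norm exceeds a fixed bound c.

    #include <cmath>
    #include <numeric>
    #include <vector>

    // Max-norm constraint: if the L2 norm of a unit's incoming weight
    // vector exceeds the bound c, project it back onto the ball of radius c.
    void constrainIncomingWeights(std::vector<double>& weights, double c) {
        const double norm = std::sqrt(std::inner_product(
            weights.begin(), weights.end(), weights.begin(), 0.0));
        if (norm > c) {
            const double scale = c / norm;
            for (double& w : weights) {
                w *= scale;
            }
        }
    }

The paper notes that a hard bound like this, unlike a penalty term, keeps weights from growing without limit no matter how large an individual update is.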