A new framework for Deep Artificial Neural Networks is born. The intention is to keep it small and with few dependencies on other libraries.
DaNNet depends only on Armadillo (and, if you want it faster, also BLAS and LAPACK).
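For reference, here is a minimal sketch of how a program using DaNNet might be compiled on Linux, assuming a system-wide Armadillo install; the file name `my_net.cpp` is just a placeholder and the `-std` flag may need adjusting:

```
g++ -std=c++14 my_net.cpp -o my_net -larmadillo
```

Linking through the `-larmadillo` wrapper pulls in whatever BLAS/LAPACK backend Armadillo was configured with, which is what gives the speed-up mentioned above.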
DaNNet differs from other frameworks (as far as I know) in that each layer can be optimized completely independently of all other layers' optimization methods and parameters. For example, you might have a structure like:
* Input
* Conv(5x5x6), SGD(0.05)
* ReLU
* Conv(5x5x10), ADAgrad(0.02)
* etc.
In other words, you have total freedom to choose whatever combination you like, as sketched in the example below.
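To make this concrete, here is a hedged sketch of what such a per-layer setup could look like. Note that all the names below (`dnn.hpp`, `dnn::network`, `dnn::conv`, `dnn::relu`, `dnn::sgd`, `dnn::adagrad`) are illustrative assumptions, not DaNNet's actual API; see the repository's example code for the real class names.

```cpp
// Hypothetical sketch -- all dnn::* names are assumptions, not DaNNet's real API.
#include <dnn.hpp>   // assumed umbrella header

int main()
{
    dnn::network net;

    // Each trainable layer gets its own optimizer instance, so the
    // conv layers below can use different update rules and step sizes.
    net.add(dnn::conv(5, 5, 6),  dnn::sgd(0.05));     // SGD, learning rate 0.05
    net.add(dnn::relu());                             // activation only, nothing to optimize
    net.add(dnn::conv(5, 5, 10), dnn::adagrad(0.02)); // ADAgrad, learning rate 0.02
    // ... more layers ...

    // net.train(training_data, labels);              // train with mixed optimizers
    return 0;
}
```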
So far it supports CPU calculations only, but performance is on par with Keras/TensorFlow, although a bit slower than Caffe.
Verified on Linux (GNU toolchain) and Windows (Visual Studio 2017).