Introducing DaNNet

A new framework for deep artificial neural networks is born. The intention is to keep it small and without too many dependencies on other libraries.
DaNNet depends only on Armadillo (and, if you want it faster, also BLAS and LAPACK).
DaNNet differs from other frameworks (as far as I know) in that each layer can be optimized completely independently of all other layers, with its own optimization method and parameters. For example, you might have a structure such as:
* Input
* Conv(5x5x6), SGD(0.05)
* ReLU
* Conv(5x5x10), ADAgrad(0.02)
* etc
In other words, you have total freedom to choose whatever combination you like.
So far it supports only CPU calculations, but performance is on par with Keras/TensorFlow, though a bit slower than Caffe.
Verified on Linux (GNU toolchain) and Windows (Visual Studio 2017).

Posted by Claes Rolen 2019-02-13 | Draft
