File Date Author Commit
 bin 2011-03-14 gnusci [r1] initial import
 flws 2011-11-30 gnusci [r36] WIN32 port
 gui 2013-07-09 gnusci [r42] using fltk-config for portability
 include 2012-05-21 gnusci [r41] minor bug in the Makefile
 obj 2011-03-14 gnusci [r1] initial import
 prg 2011-11-27 gnusci [r32] tmatrix bugs fixed
 src 2011-11-28 gnusci [r34] added activation function selection
 History 2011-11-30 gnusci [r39] Win32 port working
 Makefile 2011-03-14 gnusci [r3] minor modification of head files
 README 2011-03-14 gnusci [r1] initial import
 TODO 2011-11-25 gnusci [r11] minor modifications
 license 2011-03-14 gnusci [r3] minor modification of head files
 makeinclude 2013-07-09 gnusci [r42] using fltk-config for portability
 version 2011-11-24 gnusci [r8] added TODO

Read Me

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
                      Directories

 include: header files
 flws:    FLUID GUI development workspace
 prg:     test program source code
 src:     source files
 gui:     FLTK GUI interface (automatically generated by FLUID)
 obj:     temporary object files
 bin:     executable example programs

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
                   Compilation instructions

 Copy the compressed file wherever you want to compile it, then untar and build it:
 > $ tar -xvzf fl_anna.tar.gz
 > $ cd fl_anna
 > $ make
 You will find the test programs in the "bin" directory.
 Just type
 > $ ./bpanna
 for the command-line (no GUI) version, or
 > $ ./fl_bpann
 for the FLTK GUI application.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

 DOCUMENTATION:

 The basic steps are:

 read_data_file(file);          load a data file
 set_output(no);                set the number of outputs (no)
 set_input(ni);                 set the number of inputs (ni)
 set_layers(nl);                set the number of layers (nl)
 create();                      create the neural network framework
 set_neurons_in_layer(l,n);     set the number (n) of neurons in layer (l)
 select_training_algorithm(b);  true = Levenberg-Marquardt (LM),
                                false = conjugate gradient (CG)
 initialize();                  initialize the neural network weights
 set_alpha(alpha);              set the CG learning rate
 set_mu(mu);                    set the LM perturbation factor
 training();                    perform one training step
 evalue();                      evaluate the net and return the result in a vector
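
 The training()/evalue() cycle above can be illustrated with a minimal
 self-contained sketch. This is NOT the ANNA class: it is a single sigmoid
 neuron trained by plain gradient descent, where alpha plays the role of
 the CG learning rate, just to show what one training step followed by an
 evaluation looks like.

 ```cpp
 #include <cmath>
 #include <cstdio>

 // Stand-in for the ANNA workflow: one sigmoid neuron, squared-error loss.
 struct TinyNet {
     double w = 0.5, b = 0.0, alpha = 0.5;  // weight, bias, learning rate

     // evalue(): run the net forward and return its output
     double evalue(double x) const {
         return 1.0 / (1.0 + std::exp(-(w * x + b)));
     }

     // training(): one gradient-descent step toward the target
     void training(double x, double target) {
         double y = evalue(x);
         double g = (y - target) * y * (1.0 - y);  // dE/dz for squared error
         w -= alpha * g * x;
         b -= alpha * g;
     }
 };

 int main() {
     TinyNet net;
     // Repeated one-step training calls, learning to map x=1 -> 1.
     for (int i = 0; i < 2000; ++i) net.training(1.0, 1.0);
     std::printf("output after training: %f\n", net.evalue(1.0));
     return 0;
 }
 ```

 With the real ANNA classes the same cycle applies: call training() in a
 loop until the error is low enough, then call evalue() to read the result.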

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

 The Artificial Neural Network Architecture (ANNA) is a
 backpropagation neural network class that is easy to use with
 the FLTK library, so you will need FLTK-1.3 installed.

 You can get FLTK from www.fltk.org

 The ANNA class library is composed of the following files:

 anna_bpn_net.h anna_bpn_layer.h anna_bpn_neuron.h
 anna_bpn_net.cxx anna_bpn_layer.cxx anna_bpn_neuron.cxx

 The distribution includes a demo which should work on Linux systems.


 So enjoy it!

                                       Edmanuel Torres
                                       eetorres@gmail.com

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++