Files in this release (2014-09-05):
    example_AMORE.tar.gz   78.4 kB
    run_AMORE.R            12.8 kB
    LICENSE.txt            35.1 kB
    README.txt              2.8 kB
# A script for AMORE
# Author:
# Pezhman Kazemi: pezhman.kazemi@uj.edu.pl; pezhman.kazemi@gmail.com
# Co-Authors:  
# Aleksander Mendyk: mfmendyk@cyf-kr.edu.pl; aleksander.mendyk@uj.edu.pl
# Adam Pacławski: adam.paclawski@uj.edu.pl
# Jakub Szlęk: j.szlek@uj.edu.pl
# License: GPLv3
# This program comes with ABSOLUTELY NO WARRANTY! USE IT AT YOUR OWN RISK!
# Acknowledgment:  This work was supported by the IPROCOM Marie Curie initial training network, funded through the People Programme (Marie Curie Actions) of the European
# Union’s Seventh Framework Programme FP7/2007-2013/ under REA grant agreement No. 316555.



This script runs the "A MORE flexible neural network" package (AMORE) from the R environment in 10-fold cross-validation mode. AMORE is a neural-network-based data analysis system. For more information, please refer to its original manual in the CRAN repository:

http://cran.r-project.org/web/packages/AMORE/index.html

It requires data prepared as 10 pairs of training-testing datasets in tab-delimited TXT files, where the last column contains the known answer to the problem (dependent variable) and the preceding columns are features (regressors or independent variables).
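The expected layout above can be sketched in base R as follows. This is a minimal illustration, and the file names "train_1.txt"/"test_1.txt" are hypothetical (the README does not fix a naming scheme) -- adjust them to match your own 10 cross-validation pairs.

```r
# Load one training/testing pair of tab-delimited TXT files.
# NOTE: file names are hypothetical examples, not mandated by the script.
load_pair <- function(k) {
  train <- read.table(sprintf("train_%d.txt", k), header = FALSE, sep = "\t")
  test  <- read.table(sprintf("test_%d.txt",  k), header = FALSE, sep = "\t")
  n <- ncol(train)
  list(
    train_x = as.matrix(train[, -n]),  # features: all but the last column
    train_y = train[[n]],              # dependent variable: last column
    test_x  = as.matrix(test[, -n]),
    test_y  = test[[n]]
  )
}
```

Repeating this for k = 1..10 yields the ten folds the script iterates over.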

The script's basic adjustable parameters are:

n_neurons
Numeric vector containing the number of neurons in each layer. The first element of the vector is the number of input neurons, the last is the number of output neurons, and the remaining elements are the numbers of neurons in the hidden layers.

learning_rate_global
Learning rate at which every neuron is trained.

momentum_global
Momentum for every neuron. Needed by several training methods.

hidden_layer
Activation function of the hidden layer neurons. Available functions are:
"purelin".
"tansig".
"sigmoid".
"hardlim".
"custom": The user must manually define the f0 and f1 elements of the neurons.

output_layer
Activation function of the output layer neurons, chosen from the list shown above.

method 
Preferred training method. Currently it can be:
"ADAPTgd": Adaptive gradient descent.
"ADAPTgdwm": Adaptive gradient descent with momentum.
"BATCHgd": Batch gradient descent.
"BATCHgdwm": Batch gradient descent with momentum.
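The parameters above map onto AMORE's newff()/train() interface roughly as sketched below. This is an illustrative sketch, not the run_AMORE.R script itself; the layer sizes, learning rate, momentum, and toy data are arbitrary choices, and the argument names are taken from the AMORE manual (verify against your installed version).

```r
library(AMORE)

# Build a network: 2 inputs, one hidden layer of 10 neurons, 1 output.
# hidden_layer/output_layer/method correspond to the parameters above.
net <- newff(n.neurons           = c(2, 10, 1),
             learning.rate.global = 1e-2,   # learning_rate_global
             momentum.global      = 0.5,    # momentum_global
             error.criterium      = "LMS",
             hidden.layer         = "tansig",   # hidden_layer
             output.layer         = "purelin",  # output_layer
             method               = "ADAPTgdwm")

# Toy training data (arbitrary, for illustration only).
P <- matrix(runif(200, -1, 1), ncol = 2)  # features
T <- P[, 1] + P[, 2]^2                     # target (last column in the files)

result <- train(net, P, T, error.criterium = "LMS",
                report = TRUE, show.step = 100, n.shows = 5)

# Predictions from the trained network.
pred <- sim(result$net, P)
```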


Please find an example of how to run the script in the example.tar.gz archive. The script runs on Linux and Mac; Windows users must remove the multicore library calls.

##########################################################################
    This program comes with ABSOLUTELY NO WARRANTY.
    This is free software, and you are welcome to redistribute it
    under certain conditions. Please see the LICENSE file for a more
    detailed description of the terms and conditions based on the
    GNU GPLv3 license.
##########################################################################
Source: README.txt, updated 2014-09-05