C/C++ Neural Networks
Introduction
This project provides a FeedForward Neural Network API for C programmers.
It supports training, saving, loading, and activating FeedForward neural networks.
Training is performed using the backpropagation algorithm.
The currently supported activation functions are Sigmoid, Tanh, Linear (the raw sum of [inputs * weights]), and the Step function (used mainly in perceptrons).
For perceptrons, it is recommended to use the Perceptron API.
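
For reference, these activations follow the standard definitions. The sketch below shows them as plain C; it is not the library's internal code, just the usual formulas each option computes:

#include <math.h>

double sigmoidAct(double x) { return 1.0 / (1.0 + exp(-x)); }   /* squashes the weighted sum into (0, 1) */
double tanhAct(double x)    { return tanh(x); }                  /* squashes the weighted sum into (-1, 1) */
double linearAct(double x)  { return x; }                        /* passes the raw weighted sum through */
double stepAct(double x, double threshold) {                     /* perceptron-style hard threshold */
    return x >= threshold ? 1.0 : 0.0;
}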

Special Features
- Starting from the 2015-10-29 version, training can be accelerated using the function "FeedForwardNetwork_train_fast()". In contrast to the usual training function "FeedForwardNetwork_train()", the fast function also takes a pointer to a user function. The user's function receives 2 double arrays: the expected results and the actual network output. It should return 1 if the actual output is satisfying and 0 otherwise. This saves a lot of training time, since back-propagation and weight updates are skipped for samples that are already considered satisfying.

This feature can be very useful in some cases:
- When you train your network to return a binary result. E.g. if you decide that a result greater than 0.5 means a sample is classified as "true", you can implement your function to return 1 whenever the expected result is 1 and the output is greater than 0.5 (or the expected result is 0 and the output is less than 0.5)
- When you have many output neurons, each representing a class, and you take the output with the maximum value as the "winning" class. You can implement your function to return 1 if the "winning" class in the output matches the one in the expected result (a sketch of such a checker follows this list)
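
A minimal sketch of such a checker for the "winning class" case. It assumes the checker receives the expected outputs first and the actual outputs second (as described above), and that the number of output neurons is fixed and known to the caller:

#define NUM_CLASSES 4   /* must match the network's number of output neurons */

/* Returns 1 (skip back-propagation) when the index of the largest actual output
   already matches the index of the largest expected output, 0 otherwise. */
static int winningClassChecker(const double *expected, const double *actual) {
    unsigned i, expectedBest = 0, actualBest = 0;
    for (i = 1; i < NUM_CLASSES; i++) {
        if (expected[i] > expected[expectedBest]) expectedBest = i;
        if (actual[i] > actual[actualBest]) actualBest = i;
    }
    return expectedBest == actualBest;
}

/* Usage, inside a training loop:
   FeedForwardNetwork_train_fast(ffn, inputs, expectedOutputs, winningClassChecker); */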

API & Usage Example

Usage Example

#include <stdio.h>
#include <stdlib.h>

#include "freeNN/feedforwardnetwork.h"

// Network constants
static const unsigned NUM_INPUTS = 2;
static const unsigned NUM_OUTPUTS = 4;
static const unsigned NUM_HIDDEN_LEVELS = 1;
static const unsigned NUM_NEURONS_IN_HIDDEN_LEVELS[] = {6};
static const int WITH_BIAS = 1;
static const double TRAINING_RATE = 0.08;
static const double OUTPUT_TRAINING_RATE = 0.01;
static const NeuronActivationFunction HIDDEN_LAYER_TYPE = TANH;
static const NeuronActivationFunction OUTPUT_LAYER_TYPE = LINEAR;

static const unsigned NUM_EPOCHS = 10000;

int main() {
    // Defining network custom settings
    FeedForwardNetworkSettings *settings = FeedForwardNetworkSettings_new_default();
    settings->withBias_ = WITH_BIAS;
    settings->trainingRate_ = TRAINING_RATE;
    settings->outputLayerTrainingRate_ = OUTPUT_TRAINING_RATE;
    settings->hiddenLayerType_ = HIDDEN_LAYER_TYPE;
    settings->outputLayerType_ = OUTPUT_LAYER_TYPE;

    // Creating instance of the network
    FeedForwardNetwork *ffn = FeedForwardNetwork_new(NUM_INPUTS, NUM_OUTPUTS, NUM_HIDDEN_LEVELS, NUM_NEURONS_IN_HIDDEN_LEVELS, settings);
    double inputs[NUM_INPUTS];
    double expectedOutputs[NUM_OUTPUTS];
    double *results;
    int i;

    // Training 1st output neuron to function as AND
    //          2nd output neuron to function as OR
    //          3rd output neuron to function as XOR
    //          4th output neuron to a custom function (0,0 -> 9.128  , 0,1 -> -4.27  , 1,0 -> 3.927  , 1,1 -> -8.532)
    for (i = 0 ; i < NUM_EPOCHS ; i++) {
        inputs[0] = 0;
        inputs[1] = 0;
        expectedOutputs[0] = 0;
        expectedOutputs[1] = 0;
        expectedOutputs[2] = 0;
        expectedOutputs[3] = 9.128;
        FeedForwardNetwork_train(ffn, inputs, expectedOutputs);
        inputs[0] = 0;
        inputs[1] = 1;
        expectedOutputs[0] = 0;
        expectedOutputs[1] = 1;
        expectedOutputs[2] = 1;
        expectedOutputs[3] = -4.27;
        FeedForwardNetwork_train(ffn, inputs, expectedOutputs);
        inputs[0] = 1;
        inputs[1] = 0;
        expectedOutputs[0] = 0;
        expectedOutputs[1] = 1;
        expectedOutputs[2] = 1;
        expectedOutputs[3] = 3.927;
        FeedForwardNetwork_train(ffn, inputs, expectedOutputs);
        inputs[0] = 1;
        inputs[1] = 1;
        expectedOutputs[0] = 1;
        expectedOutputs[1] = 1;
        expectedOutputs[2] = 0;
        expectedOutputs[3] = -8.532;
        FeedForwardNetwork_train(ffn, inputs, expectedOutputs);
    }
    // Printing results
    printf("--------------------------------------------------------------------\n");
    printf("| Input   | AND Neuron  | OR Neuron   | XOR Neuron | Custom Neuron |\n");
    printf("--------------------------------------------------------------------\n");
    inputs[0] = 0;
    inputs[1] = 0;
    results = FeedForwardNetwork_activate(ffn, inputs);
    printf("| (0,0)   | %f    | %f    | %f   | %f      |\n", results[0], results[1], results[2], results[3]);
    printf("--------------------------------------------------------------------\n");
    free(results);
    inputs[0] = 0;
    inputs[1] = 1;
    results = FeedForwardNetwork_activate(ffn, inputs);
    printf("| (0,1)   | %f    | %f    | %f   | %f      |\n", results[0], results[1], results[2], results[3]);
    printf("--------------------------------------------------------------------\n");
    free(results);
    inputs[0] = 1;
    inputs[1] = 0;
    results = FeedForwardNetwork_activate(ffn, inputs);
    printf("| (1,0)   | %f    | %f    | %f   | %f      |\n", results[0], results[1], results[2], results[3]);
    printf("--------------------------------------------------------------------\n");
    free(results);
    inputs[0] = 1;
    inputs[1] = 1;
    results = FeedForwardNetwork_activate(ffn, inputs);
    printf("| (1,1)   | %f    | %f    | %f   | %f      |\n", results[0], results[1], results[2], results[3]);
    printf("--------------------------------------------------------------------\n");
    free(results);

    // Calling the destructor for the network
    FeedForwardNetwork_destroy(ffn);

    return 0;
}

Example Outputs
* Your results may differ slightly, since the network weights are initialized randomly

--------------------------------------------------------------------
| Input   | AND Neuron  | OR Neuron   | XOR Neuron | Custom Neuron |
--------------------------------------------------------------------
| (0,0)   | 0.000000    | -0.000000   | 0.000000   | 9.128000      |
--------------------------------------------------------------------
| (0,1)   | -0.000000   | 1.000000    | 1.000000   | -4.270000     |
--------------------------------------------------------------------
| (1,0)   | -0.000000   | 1.000000    | 1.000000   | 3.927000      |
--------------------------------------------------------------------
| (1,1)   | 1.000000    | 1.000000    | 0.000000   | -8.532000     |
--------------------------------------------------------------------

Perceptron Usage Example

#include <stdio.h>

#include "freeNN/perceptron.h"

int main() {
    // Constants
    const int TRAINING_ITERATIONS = 160;
    const int NUM_OF_INPUTS = 2;
    const double TRAINING_RATE = 0.2;

    const double ZERO_ZERO[] = {0, 0};
    const double ZERO_ONE[]  = {0, 1};
    const double ONE_ZERO[]  = {1, 0};
    const double ONE_ONE[]   = {1, 1};

    int i;
    // Creating Perceptron instances
    Perceptron *pAND = Perceptron_new(NUM_OF_INPUTS, TRAINING_RATE);
    Perceptron *pOR  = Perceptron_new(NUM_OF_INPUTS, TRAINING_RATE);

    // Printing the results of the randomly generated perceptrons (BEFORE TRAINING)
    printf("Results for 'OR' perceptron, BEFORE training:\n");
    printf("Input: (0,0). Result: %f\n", Perceptron_getResult(pOR, ZERO_ZERO));
    printf("Input: (0,1). Result: %f\n", Perceptron_getResult(pOR, ZERO_ONE));
    printf("Input: (1,0). Result: %f\n", Perceptron_getResult(pOR, ONE_ZERO));
    printf("Input: (1,1). Result: %f\n", Perceptron_getResult(pOR, ONE_ONE));

    printf("Results for 'AND' perceptron, BEFORE training:\n");
    printf("Input: (0,0). Result: %f\n", Perceptron_getResult(pAND, ZERO_ZERO));
    printf("Input: (0,1). Result: %f\n", Perceptron_getResult(pAND, ZERO_ONE));
    printf("Input: (1,0). Result: %f\n", Perceptron_getResult(pAND, ONE_ZERO));
    printf("Input: (1,1). Result: %f\n", Perceptron_getResult(pAND, ONE_ONE));

    // Training each of the perceptrons
    for (i = 0 ; i < TRAINING_ITERATIONS ; i++) {
        Perceptron_train(pAND, ZERO_ZERO, 0);
        Perceptron_train(pAND, ZERO_ONE,  0);
        Perceptron_train(pAND, ONE_ZERO,  0);
        Perceptron_train(pAND, ONE_ONE,   1);

        Perceptron_train(pOR, ZERO_ZERO,  0);
        Perceptron_train(pOR, ZERO_ONE,   1);
        Perceptron_train(pOR, ONE_ZERO,   1);
        Perceptron_train(pOR, ONE_ONE,    1);
    }

    // Printing the results of the trained perceptrons (AFTER TRAINING)
    printf("Results for 'OR' perceptron, AFTER training:\n");
    printf("Input: (0,0). Result: %f\n", Perceptron_getResult(pOR, ZERO_ZERO));
    printf("Input: (0,1). Result: %f\n", Perceptron_getResult(pOR, ZERO_ONE));
    printf("Input: (1,0). Result: %f\n", Perceptron_getResult(pOR, ONE_ZERO));
    printf("Input: (1,1). Result: %f\n", Perceptron_getResult(pOR, ONE_ONE));

    printf("Results for 'AND' perceptron, AFTER training:\n");
    printf("Input: (0,0). Result: %f\n", Perceptron_getResult(pAND, ZERO_ZERO));
    printf("Input: (0,1). Result: %f\n", Perceptron_getResult(pAND, ZERO_ONE));
    printf("Input: (1,0). Result: %f\n", Perceptron_getResult(pAND, ONE_ZERO));
    printf("Input: (1,1). Result: %f\n", Perceptron_getResult(pAND, ONE_ONE));

    // Destroying the Perceptron instances
    Perceptron_destroy(pAND);
    Perceptron_destroy(pOR);

    return 0;
}

Perceptron Example Results

Results for 'OR' perceptron, BEFORE training:
Input: (0,0). Result: 0.000000
Input: (0,1). Result: 1.000000
Input: (1,0). Result: 1.000000
Input: (1,1). Result: 1.000000
Results for 'AND' perceptron, BEFORE training:
Input: (0,0). Result: 0.000000
Input: (0,1). Result: 0.000000
Input: (1,0). Result: 0.000000
Input: (1,1). Result: 0.000000
Results for 'OR' perceptron, AFTER training:
Input: (0,0). Result: 0.000000
Input: (0,1). Result: 1.000000
Input: (1,0). Result: 1.000000
Input: (1,1). Result: 1.000000
Results for 'AND' perceptron, AFTER training:
Input: (0,0). Result: 0.000000
Input: (0,1). Result: 0.000000
Input: (1,0). Result: 0.000000
Input: (1,1). Result: 1.000000

FeedForward Network API

/*
Constructor (Use settings=NULL for default settings)
*/
FeedForwardNetwork *FeedForwardNetwork_new(unsigned numInputs, unsigned numOutputs, unsigned numHiddenLevels, const unsigned *numNeuronsInHiddenLevels, FeedForwardNetworkSettings *settings);

/*
Constructor from string
(A FeedForward Network instance can be saved as a string; that string can be used to construct an identical network)
*/
FeedForwardNetwork *FeedForwardNetwork_new_from_string(const char *networkStr);

/*
Constructor from file
Similar to the "from string" constructor, but reads the network representation from a file
*/
FeedForwardNetwork *FeedForwardNetwork_new_from_file(const char *filePath);

/*
Destructor
*/
void FeedForwardNetwork_destroy(FeedForwardNetwork *network);

/*
Train the network on a given input vector and desired output vector
*/
void FeedForwardNetwork_train(FeedForwardNetwork *network, const double *inputs, const double *expectedOutputs);

/*
Train the network on a given input vector and desired output vector.
This function is faster than the regular "train" function, as it lets the user supply a function that decides whether to update the network, based on the desired output and the actual output
*/
int FeedForwardNetwork_train_fast(FeedForwardNetwork *network, const double *inputs, const double *expectedOutputs, int (*binaryChecker)(const double*, const double*));

/*
Activate the network on a given input vector (i.e. get the values of the output-layer neurons)
*/
double *FeedForwardNetwork_activate(const FeedForwardNetwork *network, const double *inputs);

/* 
Get a string which represents the network (Can be used to clone the network)
*/
char *FeedForwardNetwork_toString(const FeedForwardNetwork *network);

/*
Save the string which represents the network to a file
*/
int FeedForwardNetwork_saveToFile(const FeedForwardNetwork *network, const char *pathToFile);

/*
Get a new default instance of FeedForwardNetworkSettings
*/
FeedForwardNetworkSettings *FeedForwardNetworkSettings_new_default();
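
A minimal sketch of how the persistence functions above can be combined to save a trained network and reload it later. The file name is arbitrary, the returned string is assumed to be heap-allocated (like the results of "FeedForwardNetwork_activate()"), and the return-value convention of "FeedForwardNetwork_saveToFile()" is not documented above, so it is left uninterpreted:

#include <stdio.h>
#include <stdlib.h>

#include "freeNN/feedforwardnetwork.h"

void saveAndReload(FeedForwardNetwork *trained) {
    // Persist the network to a file; the meaning of the returned int is not interpreted here
    int saveStatus = FeedForwardNetwork_saveToFile(trained, "trained_network.ffn");
    (void)saveStatus;

    // Construct an identical network from the saved file
    FeedForwardNetwork *fromFile = FeedForwardNetwork_new_from_file("trained_network.ffn");

    // The string form can be used for in-memory cloning
    char *asString = FeedForwardNetwork_toString(trained);
    FeedForwardNetwork *fromString = FeedForwardNetwork_new_from_string(asString);
    free(asString);   /* assuming the string is heap-allocated */

    FeedForwardNetwork_destroy(fromFile);
    FeedForwardNetwork_destroy(fromString);
}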

Perceptron API

/*
Constructor
*/
Perceptron *Perceptron_new(unsigned numOfInputs, double trainingRate);

/*
Constructor from a string (A Perceptron object can be saved as a string; that string can be used to create an identical object)
*/
Perceptron *Perceptron_new_from_string(const char *perceptronStr);

/*
Destructor
*/
void Perceptron_destroy(Perceptron *perceptron);

/*
Get the sum of [inputs * weights] 
*/
double Perceptron_getValue(const Perceptron *perceptron, const double inputs[]);

/*
Change the training rate
*/
void Perceptron_setTrainingRate(Perceptron *perceptron, double trainingRate);

/*
Train the perceptron on a given input vector and a result
*/
void Perceptron_train(Perceptron *perceptron, const double inputs[], int expectedResult);

/*
Get the activation value result
*/
double Perceptron_getResult(const Perceptron *perceptron, const double inputs[]);

/*
Get the weight at the i'th index
*/
double Perceptron_getWeightAt(const Perceptron *perceptron, unsigned index);

/*
Get the weights vector
*/
const double *Perceptron_getWeights(const Perceptron *perceptron);

/*
Get the length of the input vector
*/
unsigned Perceptron_getNumOfInputs(const Perceptron *perceptron);

/*
Get the Perceptron step-function threshold
*/
double Perceptron_getThreshold(const Perceptron *perceptron);

/*
Get the training rate
*/
double Perceptron_getTrainingRate(const Perceptron *perceptron);

/*
Change the weight of the i'th input
*/
void Perceptron_setWeightAt(Perceptron *perceptron, unsigned index, double weight);

/*
Override the weights vector with another one
*/
void Perceptron_setWeights(Perceptron *perceptron, const double *weights);

/*
Change the Perceptron step-function threshold
*/
void Perceptron_setThreshold(Perceptron *perceptron, double threshold);

/*
Convert a Perceptron instance into string (Useful for saving/loading Perceptrons and for deeply cloning them)
*/
char *Perceptron_toString(const Perceptron *perceptron);
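
A minimal sketch (assuming, as with the network string above, that the returned string is heap-allocated) showing how a trained perceptron can be cloned through its string form and how its learned parameters can be inspected:

#include <stdio.h>
#include <stdlib.h>

#include "freeNN/perceptron.h"

void cloneAndInspect(const Perceptron *trained) {
    // Clone the perceptron through its string representation
    char *asString = Perceptron_toString(trained);
    Perceptron *clone = Perceptron_new_from_string(asString);
    free(asString);   /* assuming the string is heap-allocated */

    // Inspect the learned parameters of the clone
    unsigned i, numInputs = Perceptron_getNumOfInputs(clone);
    printf("Threshold: %f, training rate: %f\n",
           Perceptron_getThreshold(clone), Perceptron_getTrainingRate(clone));
    for (i = 0; i < numInputs; i++) {
        printf("Weight %u: %f\n", i, Perceptron_getWeightAt(clone, i));
    }

    Perceptron_destroy(clone);
}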
