Ask questions here

Help
2003-11-19
2012-12-25
1 2 3 > >> (Page 1 of 3)
  • Steffen Nissen

    Steffen Nissen - 2003-11-19

    Please ask any questions here, I am monitoring this forum, and will try to answer all questions.

    Regards,
    Steffen

     
    • neyro

      neyro - 2004-06-16

      I'm trying to write software that performs pattern recognition, using a NN to decide (or guess) where a particular item is located in an image.

      To do this I generate 944 input values (color averages of small areas of the original image) and I need 4 outputs (up, down, left, right) to decide where my "virtual eye" should move.

      For every learning session I have 50 to 100 input sequences.

      Is it possible to use a nn for such a large amount of data?

       
    • Martin Prochazka

      I would like to develop an unsupervised learning method using FANN in Python. I'm looking for the best library to do that, and FANN looks good. But in the neuron structure documentation I found:
      This structure is subject to change at any time. If you require direct access to the contents of this structure, you may want to consider contacting the FANN development team.

      Properties

      fann_type * weights

          This property is not yet documented.
      struct fann_neuron ** connected_neurons

          This property is not yet documented.
      ...

      which makes it seem that operating directly on the network's internals is not usual. I only need to update the weights, and from Python. Is it possible?

      Thank you for answering

       
    • Washuu

      Washuu - 2005-01-27

      Hi, I'm just starting my adventure with NNs.

      I'm trying to teach my ANN 4-bit multiplication. I tried with 1 and 2 hidden layers and many different numbers of neurons, and couldn't get the error below 1. Has anyone tried something like this? Any suggestions?

      Of course I know the results won't be in the {0,1} set, but floats.
      A snippet from my definition file:

      256 8 8
      0 0 0 0 0 0 0 0
      0 0 0 0 0 0 0 0
      ...
      1 0 1 1 0 0 1 0
      0 0 0 1 0 1 1 0
      1 1 0 0 0 0 1 0
      0 0 0 1 1 0 0 0
      ...
      1 1 1 1 1 1 1 1
      1 1 1 0 0 0 0 1
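      For reference, a file like the one above can be generated mechanically. The sketch below assumes (as the snippet suggests) that the 8 input bits are the two 4-bit operands and the 8 output bits are their product; the function name is ours, not FANN API.

      ```c
      #include <stdio.h>

      /* Writes a FANN-format training file for 4-bit multiplication:
       * header "256 8 8", then for each pair (a, b) one line with the 8
       * input bits (a and b, 4 bits each, most significant first) and
       * one line with the 8 output bits of the product a*b (max 225, so
       * it fits in 8 bits).  The bit packing is inferred from the
       * snippet above. */
      void write_mul4_training(FILE *out) {
          fprintf(out, "256 8 8\n");
          for (int a = 0; a < 16; a++) {
              for (int b = 0; b < 16; b++) {
                  for (int i = 3; i >= 0; i--) fprintf(out, "%d ", (a >> i) & 1);
                  for (int i = 3; i >= 0; i--) fprintf(out, "%d ", (b >> i) & 1);
                  fprintf(out, "\n");
                  int p = a * b;
                  for (int i = 7; i >= 0; i--) fprintf(out, "%d ", (p >> i) & 1);
                  fprintf(out, "\n");
              }
          }
      }
      ```

      Calling write_mul4_training(stdout) and redirecting to a file yields all 256 input/output pairs; an 8-bit binary target is a hard mapping for a small MLP, which may partly explain why the error stays high.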

       
    • chandra

      chandra - 2005-02-13

      Hi, I am unclear about Fuzzy ARTMAP. Could you please suggest a website or journal that would give me a clear idea of it? I am doing a small project using Fuzzy ARTMAP.

       
    • zarzyk3

      zarzyk3 - 2005-02-18

      Hi!
      Firstly, sorry for starting a new thread; I only found this one later.
      I use DJGPP with the RHIDE IDE under WinXP and want to use the FANN library, but I have a problem installing it. How should I do it?
      Thanks for any help

       
    • JEI

      JEI - 2005-02-25

      I'm running a neural network using FANN with 25 inputs and 2 outputs. I was receiving valid results with 3 layers. Another developer in my area, using the same data and another NN API, is using 4 layers and achieved better results. I attempted to use 4 layers and found that the reported MSEs were ridiculously high (MUCH greater than 1 on a double-float NN). The network trained but bottomed out at about MSE=4.111...

      Can anyone tell me why this would happen?

       
      • Thomas Johnson

        Thomas Johnson - 2005-02-26

        Maybe you're using a different learning method? A different learning rate (or other parameter value, if you're using non-backprop)? A different set of connections (i.e., full vs. layer-to-layer connectivity)? Are you both calculating the MSE in the same way? I believe FANN adds the errors of each output neuron to determine the MSE, while other packages might report the average error.
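        The two MSE conventions described here can be shown side by side; for identical errors they differ by a factor of the number of outputs. Function names are illustrative, not FANN API.

        ```c
        /* Two common MSE conventions for a multi-output network.  One
         * sums the squared errors over the output neurons (the FANN
         * behavior as described above), the other also divides by the
         * number of outputs, so the reported numbers differ by a
         * factor of num_outputs. */
        double mse_summed(const double *err, int n_patterns, int n_outputs) {
            double s = 0.0;
            for (int i = 0; i < n_patterns * n_outputs; i++)
                s += err[i] * err[i];
            return s / n_patterns;          /* averaged over patterns only */
        }

        double mse_averaged(const double *err, int n_patterns, int n_outputs) {
            /* additionally averaged over the output neurons */
            return mse_summed(err, n_patterns, n_outputs) / n_outputs;
        }
        ```

        With two outputs that are both exactly wrong by 1.0, the summed convention reports 2.0 where the averaged one reports 1.0, which matters when comparing numbers across packages.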

         
        • JEI

          JEI - 2005-02-26

          We are using similar, if not the same, learning methods. Our learning rates are both scalar (they should change over time to keep from over-fitting). Both networks are fully connected. The issue isn't that we're seeing different results.

          We both achieve similar results with 3 layers.

          THE ISSUE IS that he can achieve results with 4 layers, while FANN cannot. There is NO way that, given two outputs (even if added together), the MSE could equal 4. Assuming that both outputs were exactly wrong (expects two 0s and gets two 1s), the MSE would equal 2. How am I possibly getting 4 and greater?

           
          • Thomas Johnson

            Thomas Johnson - 2005-02-27

            I have no idea how you're getting 4 and greater. Why don't you start up ddd/gdb and trace through the function where the MSE is calculated? It's probably not too hard.

             
            • Steffen Nissen

              Steffen Nissen - 2005-02-28

              Try sending your code to this forum and I will have a look at it. It would also help if you could send your training data.

              Two common mistakes are:
              - Forgetting to set the sizes of each layer in the fann_create function (this function takes a variable number of arguments)
              - A wrongly formatted training file. There is not much checking of how the data is represented, so if you e.g. have one number too many on a line, the order of the data will be screwed up and input can become output, etc.

              A good way to test for problems is to loop over all of the training data and reset the MSE inside the loop, so that you can print the MSE for each individual training pattern. This way you can see where it goes wrong.
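              Sketched without the FANN API (predict() is a toy stand-in for running the network on one input; with FANN you would call fann_reset_MSE(), fann_test() and fann_get_MSE() in the same places), the per-pattern check looks like this:

              ```c
              #include <stdio.h>

              /* Per-pattern error check: loop over the training patterns,
               * start from a fresh error accumulator for each one, and print
               * the individual MSE so a single malformed pattern stands out
               * immediately. */
              static double predict(double x) { return 2.0 * x; } /* pretend model */

              double squared_error(double x, double target) {
                  double e = predict(x) - target;  /* fresh accumulator per pattern */
                  return e * e;                    /* single output neuron */
              }

              void report_per_pattern(const double *in, const double *out, int n) {
                  for (int i = 0; i < n; i++)      /* one line per training pattern */
                      printf("pattern %d: MSE %.3f\n", i,
                             squared_error(in[i], out[i]));
              }
              ```

              A pattern whose error dwarfs all the others usually points at a formatting slip in the training file.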

               
              • JEI

                JEI - 2005-03-01

                #include <stdio.h>
                #include <stdlib.h>   /* malloc, free */
                #include <string.h>
                #include <unistd.h>
                #include <doublefann.h>

                void execute_FANN(int*, double*);

                // input filename, output filename, field to pull for testing.
                double* pull_test(char[], char[], int);

                // input filename only. will be edited internally
                int* normalize_data(char[]);

                #define CONNECTION_RATE 1.0
                #define LEARNING_RATE 0.4
                #define INPUT 25
                #define OUTPUT 2
                #define LAYERS 4
                #define HIDDEN_NEURONS 9
                #define DESIRED_ERROR 0.0001
                #define MAX_ITERATIONS 700000
                #define ITERATIONS_BETWEEN_REPORTS 500

                int main (int argc, char *argv[]) {
                        struct fann *ann;
                        struct fann_train_data *data;
                        int *FIO, field;
                        char file[256], test_file[256];
                        double *test_field;

                        strncpy(file, argv[1], 256);

                        if (!(FIO = normalize_data(file)) || FIO[0] == 0) {
                                printf("Error normalizing %s.\n", argv[1]);
                                return 1;
                        }

                        for (field = 0; field < FIO[0]; field++) {
                                test_field = pull_test(file, test_file, field);

                                ann = fann_create(CONNECTION_RATE, LEARNING_RATE, LAYERS, INPUT, HIDDEN_NEURONS, OUTPUT);
                                data = fann_read_train_from_file(test_file);

                                fann_set_activation_function_output(ann, FANN_SIGMOID);
                                fann_set_activation_function_hidden(ann, FANN_SIGMOID);
                                fann_set_training_algorithm(ann, FANN_TRAIN_BATCH);

                                printf("------------------------------------------------------------");
                                printf("\nTraining FANN without field %d of %d. (From file: %s)\n\n", field+1, FIO[0], file);

                                fann_train_on_data(ann, data, MAX_ITERATIONS, ITERATIONS_BETWEEN_REPORTS, DESIRED_ERROR);

                                fann_save(ann, "training.net");

                                fann_destroy(ann);
                                fann_destroy_train(data);

                                execute_FANN(FIO, test_field);

                                free(test_field);
                                unlink(test_file);
                        }

                        return 0;
                }

                void execute_FANN(int *FIO, double *test_field) {
                        struct fann *ann;
                        fann_type *output, *test_output;
                        fann_type *input;
                        int counter;

                        input = (fann_type *)malloc((FIO[1])* sizeof(fann_type));
                        for (counter = 0; counter < FIO[1]; counter++) {
                                input[counter] = test_field[counter];
                        }
                        output = (fann_type *)malloc((FIO[2])* sizeof(fann_type));
                        for (counter = 0; counter < FIO[2]; counter++) {
                                output[counter] = test_field[counter+FIO[1]];
                        }

                        ann = fann_create_from_file("training.net");

                        fann_set_activation_function_output(ann, FANN_SIGMOID);
                        fann_set_activation_function_hidden(ann, FANN_SIGMOID);

                        test_output = fann_test(ann, input, output);

                        printf("\nCurrent FANN run against missing data yields output:\n");
                        for (counter = 0; counter < FIO[2]; counter++) {
                                printf("\tOutput %d:\t%lf\t(Expected :\t%lf)\n", counter, test_output[counter], test_field[counter+FIO[1]]);
                        }
                        printf("\tMSE: %lf\n\n", fann_get_MSE(ann));

                        free(input);
                        free(output);
                        fann_destroy(ann);
                }

                double* pull_test(char in_file[], char out_file[], int field) {
                        FILE *IN = NULL, *OUT = NULL;
                        int fields, inputs, outputs, counter[2];
                        double *test_field, temp;

                        IN = fopen(in_file, "r");
                        if (IN == NULL) {return 0;}
                        strncpy(out_file, in_file, 256);
                        strncat(out_file, ".test", 256);
                        OUT = fopen(out_file, "w");
                        if (OUT == NULL) {return 0;}

                        fscanf(IN, "%d %d %d", &fields, &inputs, &outputs);
                        fprintf(OUT, "%d %d %d\n", fields-1, inputs, outputs);

                        test_field = (double *)malloc((inputs+outputs)* sizeof(double));

                        for(counter[0] = 0; counter[0] < fields; counter[0]++) {
                                if (field == counter[0]) {
                                        for (counter[1] = 0; counter[1] < inputs; counter[1]++) {
                                                fscanf(IN, "%lf", &temp);
                                                test_field[counter[1]] = temp;
                                        }
                                        for (counter[1] = 0; counter[1] < outputs; counter[1]++) {
                                                fscanf(IN, "%lf", &temp);
                                                test_field[counter[1]+inputs] = temp;
                                        }
                                } else {
                                        for (counter[1] = 0; counter[1] < inputs; counter[1]++) {
                                                fscanf(IN, "%lf", &temp);
                                                fprintf(OUT, "%.16lf ", temp);
                                        }
                                        fprintf(OUT, "\n");
                                        for (counter[1] = 0; counter[1] < outputs; counter[1]++) {
                                                fscanf(IN, "%lf", &temp);
                                                fprintf(OUT, "%.16lf ", temp);
                                        }
                                        fprintf(OUT, "\n");
                                }
                        }

                        fclose(IN);
                        fclose(OUT);

                        return test_field;
                }

                int* normalize_data (char file[]) {
                        FILE *IN = NULL, *OUT = NULL;
                        int fields, inputs, outputs;
                        int counter[2];
                        double *inputs_high, *inputs_low, *outputs_high, *outputs_low, temp;

                        static int FIO[3] = {0, 0, 0};

                        IN = fopen(file, "r");
                        if (IN == NULL) {return 0;}
                        strncat(file, ".norm", 256);
                        OUT = fopen(file, "w");
                        if (OUT == NULL) {return 0;}

                        fscanf(IN, "%d %d %d", &fields, &inputs, &outputs);
                        fprintf(OUT, "%d %d %d\n", fields, inputs, outputs);

                        inputs_high = (double *)malloc(inputs* sizeof(double));
                        inputs_low = (double *)malloc(inputs* sizeof(double));

                        outputs_high = (double *)malloc(outputs* sizeof(double));
                        outputs_low = (double *)malloc(outputs* sizeof(double));

                        for (counter[1] = 0; counter[1] < inputs; counter[1]++) {
                                fscanf(IN, "%lf", &temp);
                                inputs_high[counter[1]] = temp;
                                inputs_low[counter[1]] = temp;
                        }
                        for (counter[1] = 0; counter[1] < outputs; counter[1]++) {
                                fscanf(IN, "%lf", &temp);
                                outputs_high[counter[1]] = temp;
                                outputs_low[counter[1]] = temp;
                        }

                        for(counter[0] = 0; counter[0] < fields-1; counter[0]++) {
                                for (counter[1] = 0; counter[1] < inputs; counter[1]++) {
                                        fscanf(IN, "%lf", &temp);
                                        if (temp > inputs_high[counter[1]])
                                                inputs_high[counter[1]] = temp;
                                        else if (temp < inputs_low[counter[1]])
                                                inputs_low[counter[1]] = temp;
                                }
                                for (counter[1] = 0; counter[1] < outputs; counter[1]++) {
                                        fscanf(IN, "%lf", &temp);
                                        if (temp > outputs_high[counter[1]])
                                                outputs_high[counter[1]] = temp;
                                        else if (temp < outputs_low[counter[1]])
                                                outputs_low[counter[1]] = temp;

                                }
                        }

                        rewind(IN);
                        fscanf(IN, "%d %d %d", &fields, &inputs, &outputs);

                        for(counter[0] = 0; counter[0] < fields; counter[0]++) {
                                for (counter[1] = 0; counter[1] < inputs; counter[1]++) {
                                        fscanf(IN, "%lf", &temp);
                                        temp = (temp-inputs_low[counter[1]])/(inputs_high[counter[1]]-inputs_low[counter[1]]);
                                        fprintf(OUT, "%.16lf ", temp);
                                }
                                fprintf(OUT, "\n");
                                for (counter[1] = 0; counter[1] < outputs; counter[1]++) {
                                        fscanf(IN, "%lf", &temp);
                                        temp = (temp-outputs_low[counter[1]])/(outputs_high[counter[1]]-outputs_low[counter[1]]);
                                        fprintf(OUT, "%.16lf ", temp);
                                }
                                fprintf(OUT, "\n");
                        }

                        fclose(IN);
                        fclose(OUT);
                        free(inputs_high);
                        free(inputs_low);
                        free(outputs_high);
                        free(outputs_low);

                        FIO[0] = fields; FIO[1] = inputs; FIO[2] = outputs;

                        return FIO;
                }

                 
                • Steffen Nissen

                  Steffen Nissen - 2005-03-01

                  I can see that you are creating the network incorrectly.

                  If you want to create a 25-9-9-2 network, you should call fann_create like this:

                  ann = fann_create(CONNECTION_RATE, LEARNING_RATE, 4, 25, 9, 9, 4);

                  and not

                  ann = fann_create(CONNECTION_RATE, LEARNING_RATE, 4, 25, 9, 4);

                  This is because fann_create takes a variable number of arguments, and it is not possible to check how many arguments were actually given.
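                  The limitation described here is a property of C variadic functions in general; a plain stdarg example shows why the count cannot be verified:

                  ```c
                  #include <stdarg.h>

                  /* A C variadic function has no way to know how many arguments
                   * the caller actually passed; it simply trusts the count it is
                   * given.  sum_layers() mimics fann_create's situation: if you
                   * claim 4 layers but pass only 3 sizes, va_arg reads one
                   * garbage value and nothing warns you. */
                  int sum_layers(int num_layers, ...) {
                      va_list ap;
                      int total = 0;
                      va_start(ap, num_layers);
                      for (int i = 0; i < num_layers; i++)
                          total += va_arg(ap, int);  /* blindly reads num_layers ints */
                      va_end(ap);
                      return total;
                  }
                  ```

                  sum_layers(3, 25, 9, 2) is fine; sum_layers(4, 25, 9, 2) compiles without complaint but reads past the given arguments, which is exactly the failure mode of the fann_create call discussed above.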

                  Regards,
                  Steffen

                   
                  • JEI

                    JEI - 2005-03-01

                    what is the second 9 for? and why are you specifying 4 outputs?

                     
                  • JEI

                    JEI - 2005-03-01

                    OK, I figured it out. Thank you very much for the help. I'm now working with

                    ann = fann_create(CONNECTION_RATE, LEARNING_RATE, LAYERS, INPUTS, HIDDEN_LAYER1, HIDDEN_LAYER2, OUTPUTS)

                     
              • JEI

                JEI - 2005-03-01

                You should see in that source code that there is an auto-normalization function that gets run against the input file, which is expected to be in the normal format for a training data file, though the data does not have to be pre-formatted.

                I am not able to currently give out the test data.

                 
    • PP

      PP - 2005-02-27

      I had a similar issue with the MSE giving some arbitrary value.

      The final MSE output of the network differed from run to run if I had some other application running in the background. If there was a large system load, the learning somehow never reached acceptable levels.

      To see what all these parameters do to the network, I connected a database table (firebird.org) to the network and captured every MSE value. This makes it easy to classify the number of epochs it takes for any improvement. (I would also like to get all the weights of each neuron for each epoch, but have not figured out how yet.)

      It appeared that in misbehaved runs, some internal buffer or cache remembered things from an earlier learning cycle and got data mixed up.

      I could also see the impact of the learning rate. Even with small changes there, the network learning went down different paths or fizzled out altogether, sometimes resulting in inexplicable MSE values.

      The same is true for layers: for some unexplained reason I get better results with 2, 3, or 5 layers than with 4, 6, or 7. However, I should note that usually 2 layers are enough for most problems.

       
    • aluc4rd

      aluc4rd - 2005-03-01

      So, I explained my question in the topic "Compiling under dev c++",
      and I followed the instructions tylenol gave me:

      1) Create a new project (DLL C++)
      2) Add "floatfann.c" file (or fixed)
      3) Open project options (Alt+P)
      3.1) Tab Parameters->
      C++ compiler:
      -DWIN32
      -DNDEBUG
      -DFANN_DLL_EXPORTS
      -DUSE_WINDOWS_H
      Linker:
      -lkernel32
      3.2) Tab Directories->
      Include Directories:
      ..\src\include

      but when I compile:
      Compiler: Default compiler
      Building Makefile: "C:\Documents and Settings\veyrenche\Mes documents\remi\Makefile.win"
      Executing make...
      make.exe -f "C:\Documents and Settings\veyrenche\Mes documents\remi\Makefile.win" all
      dllwrap.exe --output-def libProject2.def --implib libProject2.a ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o -L"C:/Dev-Cpp/lib" --no-export-all-symbols --add-stdcall-alias -lkernel32 -o Project2.dll

      ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o(.text+0x3be):xor_test.c: undefined reference to `fann_create_from_file'
      ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o(.text+0x3f7):xor_test.c: undefined reference to `fann_read_train_from_file'
      ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o(.text+0x41c):xor_test.c: undefined reference to `fann_reset_MSE'
      ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o(.text+0x455):xor_test.c: undefined reference to `fann_test'
      ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o(.text+0x530):xor_test.c: undefined reference to `fann_destroy_train'

      ../../Bureau/fann_win32_dll-1.2.0/fann-1.2.0/examples/xor_test.o(.text+0x53b):xor_test.c: undefined reference to `fann_destroy'

      dllwrap.exe: no export definition file provided.
      Creating one, but that may not be what you want
      dllwrap.exe: gcc exited with status 1

      make.exe: *** [Project2.dll] Error 1

      Execution terminated

      Thanks for answering, cheers

       
    • pmcdonnell9

      pmcdonnell9 - 2005-03-03

      I just picked up a copy of FANN and am attempting to develop a NN that will perform function approximation given a sparse observation of the state space.

      The package looks great but I appear to be having a problem where the training progress stalls out after about 1000 iterations. I have played around with setting the training algorithm, changing the number of layers, the connectedness of the network etc..., but I am running out of ideas. I was hoping you might push me in the right direction.

      The system I am looking at has 7 inputs and a single output. For the specific example the inputs can take values ranging between 0 and 100 (I have scaled these to range 0 - 1 with the same results).

      The average absolute percentage error for the training set often ends up around 50%... but as I said, the improvements in the MSE appear to plateau VERY fast.

      FWIW I am using the MS .NET IDE v7.1 to build and compile.

      The typical training set is 150 - 200 examples.

      A subset of a sample data file is below:
      (do you anticipate any issue with mapping to a negative response?)

      150    7    1               
      0.43    0.33    0.66    0.78    0.01    0.28    0.98
      -9137016.123                       
      0.65    0.9    0.13    0.75    0.77    0.17    0.75
      -128313.2447                       
      0.55    0.18    0.72    0.8    0.04    0.82    0.74
      -9132227.851                       
      0.34    0.45    0.95    0.84    0.7    0.89    0.6
      -157659.163                       
      0.57    0.98    0.29    0.63    0.64    0.52    0.74
      -148179.8732                       
      0.09    0.62    0.91    1    0.61    0.44    0.72
      -151258.74                       
      0.97    0.2    0.33    0.6    0.93    0.61    0.7
      -150813.557                       
      0.97    0.29    0.35    0.56    0.81    0.62    0.46
      -144089.8042                       
      0.8    0.88    0.83    0.37    0.21    0.85    0.16
      -6757212.991                       
      0.87    0.4    0.71    0.84    0.35    0.45    0.87
      -149439.9292                       
      0.22    0.51    0.56    0.3    0.01    0.73    0.54
      -9125946.709                       
      0.32    0.67    0.08    0.31    0.07    0.38    0.46
      -10494476.7                       
      0.32    0.55    0.81    0.56    0.01    0.15    0.94
      -9131652.256                       
      0.32    0.31    0.51    0.28    0.55    0.21    0.71
      -125228.239                       
      0.79    0.31    0.42    0.59    0.57    0.8    0.21
      -1696029.72                       
      0.27    0.99    0.25    0.33    0.54    0.1    0.78
      -2041411.98                       
      0.22    0.29    0.99    0.42    0.4    0.61    0.03
      -9108079.313                       
      0.46    0.13    0.78    0.67    0.35    0.34    0.72
      -125828.16                       
      0.6    0.13    0.94    0.09    0.07    0.78    0.38
      -15552771.97                       
      0.78    0.15    0.42    0.76    0.77    0.68    0.71
      -151260.4442                       
      0.15    0.18    0.75    0.42    0.53    0.62    0.41
      -125936.0322                       
      0.55    0.11    0.37    0.46    0.87    0.24    0.82
      -136214.2371                       
      0.11    0.41    0.72    0.53    0.19    0.59    0.89
      -110919.9879                       
      0.24    0.81    0.6    0.09    0.45    0.21    0.89
      -806422.66                       
      0.98    0.74    0.5    0.07    0.35    0.44    0.28
      -4258941.744                       
      0.26    0.73    0.81    0.61    0.2    0.66    0.46
      -135819.7137                       
      0.44    0.08    0.8    0.1    0.02    0.5    0.75
      -18110916.7                       
      0.29    0.3    0.44    0.75    0.29    0.81    0.62
      -121559.53                       
      0.18    0.87    0.15    0.21    0.04    0.92    0.55
      -9348859.6                       
      0.33    0.19    0.78    0.52    0.08    0.72    0.4
      -9077948.436                       
      0.22    0.75    0.14    0.8    0.43    0.85    0.7
                 

      The latest version of the program is below:

      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>   /* strcpy, strcmp */
      #include "floatfann.h"
      #include "fann.h"

      int main(int argc, char **argv)
      {
      //    Simple NNetwork to learn response of GA output
      // 
      //    Grab command line parameters
      //  -d <data file>  name for input data file
      //  -n <network file> name for output network name
      //  -i number of inputs....
      //  -o number of outputs
      //  -t time limit for training
      //  -m avg target MSE for training set
      //  -M max MSE allowed for all outputs
      //    -MI max iterations
      //  -s stat output file

          int    arg,i,j,af;
          unsigned int k;
          char            datafile[50];
          char            networkfile[50];
          char            statfile[50];
          unsigned int    numinputs=0;
          unsigned int    numoutputs=0;
          unsigned int    maxiterations=500000;
          int                maxseconds=1000000;
          float            maxavgMSE=-1.0;
          float            maxindMSE=-1.0;
          fann_type        *result;
          FILE            *outfile;

          struct fann *an2;

          const float connection_rate = 1;
          const float learning_rate = 0.3;
          const unsigned int num_layers = 3;
          unsigned int num_neurons_hidden =2;
          const unsigned int iterations_between_reports = 1;
          struct fann_train_data *data;

          arg = 1;   /* argv[0] is the program name, skip it */
          strcpy(statfile,"None");
         
          while (arg < argc){

              if( strcmp("-d", argv[arg]) == 0 ) {
                  arg+=1;
                  strcpy(datafile,argv[arg]);
              }
              else if  (strcmp("-n", argv[arg]) == 0 ) {
                  arg+=1;
                  strcpy(networkfile,argv[arg]);
              }
              else if  (strcmp("-i", argv[arg]) == 0 ) {
                  arg+=1;
                  numinputs = atoi(argv[arg]);
              }
              else if  (strcmp("-o", argv[arg]) == 0 ) {
                  arg+=1;
                  numoutputs =atoi(argv[arg]);
              }
              else if  (strcmp("-t", argv[arg]) == 0 ) {
                  arg+=1;
                  maxseconds = atoi(argv[arg]);
              }
              else if  (strcmp("-m", argv[arg]) == 0 ) {
                  arg+=1;
                  maxavgMSE= atof(argv[arg]);
              }
              else if  (strcmp("-M", argv[arg]) == 0 ) {
                  arg+=1;
                  maxindMSE= atof(argv[arg]);
              }
              else if  (strcmp("-MI", argv[arg]) == 0 ) {
                  arg+=1;
                  maxiterations= atoi(argv[arg]);
              }
              else if  (strcmp("-s", argv[arg]) == 0 ) {
                  arg+=1;
                  strcpy(statfile,argv[arg]);
              }
              else {
                  printf ("Unrecognized command-line argument %s\n",argv[arg]);
              }

              arg+=1;
          }

          num_neurons_hidden = 2*numinputs;

         
          an2 = fann_create(connection_rate,learning_rate,num_layers,numinputs,num_neurons_hidden,numoutputs);
         
          data = fann_read_train_from_file(datafile);   
          fann_reset_MSE(an2);
      //    fann_init_weights(an2,data);

         
          fann_set_activation_function_hidden(an2,FANN_SIGMOID);

          af = fann_get_activation_function_hidden(an2);

          printf(" Hidden Activation Function %d\n",af);

          fann_set_activation_function_output(an2,FANN_LINEAR);
          af = fann_get_activation_function_output(an2);
          printf(" Output Activation Function %d\n",af);
      printf ("Here");

          fann_set_activation_steepness_hidden(an2, 0.25);
          fann_set_activation_steepness_output(an2, 0.25);
      //    fann_set_training_algorithm(an2,FANN_TRAIN_QUICKPROP);

         
          fann_train_on_file(an2,datafile,maxiterations,iterations_between_reports,maxavgMSE);

         

          if (strcmp(statfile,"None")!=0){
              outfile = fopen(statfile,"w");

              for (j=0; j != data->num_data;j++){

                  for (k = 0; k< data->num_input; k++){
                      fprintf (outfile,"%4.1f, ",data->input[j][k]);
                  }
                  fprintf (outfile,"Original, %9.2f,  ",data->output[j][0]);
                  result = fann_run(an2,data->input[j]);
                  fprintf (outfile," Predicted,%9.2f, ", *result);
                  fprintf (outfile,"Error, %6.3f\n",(*result- data->output[j][0]) /data->output[j][0] );

              }

              fclose(outfile);
          }

          fann_save(an2,networkfile);

      //    fann_destroy(an2);
         
          return 0;
      }

      Any insights would be appreciated.

      Thanks

      Patrick

       
      • Steffen Nissen

        Steffen Nissen - 2005-03-04

        Hi,

        If you need to have negative output, you will have to use some of the symmetric activation functions like e.g. FANN_SIGMOID_SYMMETRIC.

        Regards,
        Steffen
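        To illustrate why: the standard sigmoid can only produce values in (0, 1), while the symmetric variant covers (-1, 1). The definitions below follow the common convention (FANN's symmetric sigmoid is equivalent to tanh of the steepness times the input); note that the targets must also be scaled into (-1, 1).

        ```c
        #include <math.h>

        /* Output ranges of the two activations; s is the steepness. */
        double sigmoid(double s, double x) {
            return 1.0 / (1.0 + exp(-2.0 * s * x));        /* range (0, 1)  */
        }

        double sigmoid_symmetric(double s, double x) {
            return 2.0 / (1.0 + exp(-2.0 * s * x)) - 1.0;  /* range (-1, 1) */
        }
        ```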

         
    • ashot

      ashot - 2005-04-09

      Is it possible to evaluate arbitrary recurrent neural networks with fann?
      I'm looking for the evaluation only, not training.

       
      • Steffen Nissen

        Steffen Nissen - 2005-04-09

        Sorry, fann cannot evaluate recurrent ANNs.

         
    • Carlos Pascual

      Carlos Pascual - 2005-07-20

      Hi,
      I am using FANN with the Python bindings and I would like to use a custom stopping criterion for training.
      Using train_on_data_callback() would be ideal, but I understand that since the Python bindings are created with SWIG, it is difficult to do so because it requires passing a function pointer...

      So I tried to use fann_train_epoch() :
      ******************************************
      #file demo.py
      import fann
      from libfann import  fann_train_epoch
      ann=fann.create(1.,.7,(2,4,1))
      data=fann.read_train_from_file("data.train")
      fann_train_epoch(ann,data)
      ************************************************
      Output:
      $ python demo.py
      Traceback (most recent call last):
        File "demo.py", line 7, in ?
          fann_train_epoch(ann,data)
      TypeError: Expected a pointer

      (The same occurs if one tries to use
      fann_train_on_data_callback().)

      Is there any other way of doing this ?

      Am I missing a simpler solution to avoid overfitting in Python? (I just want to use the MSE of an independent test set for stopping the training.)

      Thanks in advance!
      Carlos

       
      • Steffen Nissen

        Steffen Nissen - 2005-07-25

        Hi,

        I would recommend using the fann_train_epoch function inside a loop.

        http://leenissen.dk/fann/html/r685.html

        Regards,
        Steffen
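        The train_epoch-in-a-loop pattern, with early stopping on a held-out set, can be sketched as follows. train_epoch() and validation_mse() are toy stand-ins for fann_train_epoch() and an evaluation of the validation data; their fake MSE curves exist only so the control flow runs self-contained.

        ```c
        /* The fake curves make validation error improve for 5 epochs and
         * then rise, imitating overfitting. */
        static int epoch_no = 0;

        static double train_epoch(void) {    /* one pass over training data */
            epoch_no++;
            return 1.0 / epoch_no;           /* training MSE keeps falling  */
        }

        static double validation_mse(void) { /* held-out data */
            return epoch_no <= 5 ? 1.0 / epoch_no
                                 : 0.2 + 0.05 * (epoch_no - 5);
        }

        /* Stop when the validation MSE has not improved for `patience`
         * epochs; returns the epoch with the best validation score. */
        int train_with_early_stopping(int max_epochs, int patience) {
            double best = 1e30;
            int best_epoch = 0;
            for (int e = 1; e <= max_epochs; e++) {
                train_epoch();
                double v = validation_mse();
                if (v < best) {
                    best = v;                /* save the network here */
                    best_epoch = e;
                } else if (e - best_epoch >= patience) {
                    break;                   /* no recent improvement */
                }
            }
            return best_epoch;
        }
        ```

        In a real run you would save the network (e.g. with fann_save) each time the validation score improves and restore that snapshot after the loop exits.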

         
        • Carlos Pascual

          Carlos Pascual - 2005-07-26

          > I would recommend using the fann_train_epoch
          > function inside a loop.

          That is exactly what I tried (as I said in my original post)... and the point is that I could not get fann_train_epoch to work using the Python bindings (see the trivial example in my original post).

          By the way:
          I figured out a workaround which involves using:
          fann.train_on_data(traindata, 1, 0, -1)
          as a substitute for fann_train_epoch(), ...but that is certainly not the "good" way.
          Instead, one should be able to use at least fann_train_epoch() with the Python bindings.

          Regards.

          Carlos

           
