Neuroph training issues

Forum: Help
Created: 2012-04-12
Updated: 2012-12-24
  • Diana Hintea

    Diana Hintea - 2012-04-12

    Hi and thanks in advance for any help that you may provide.

    I started using Neuroph, both the Studio version and the separate packages. With the Studio version I encountered the following problem: when I want to create the training set by loading it from a file, it always gives me a 'File not found' error. Is there a specific format that needs to be used, or perhaps the file should be placed in a specific subfolder?…

    The other issue, when using the Neuroph libraries, is that after I create the training set, its content appears to be null when I print it and the learning goes into an infinite loop. I am using Eclipse, if that's any help.

    Many thanks.

     
  • Zoran Sevarac

    Zoran Sevarac - 2012-04-13

    If you're getting 'File not found', most likely you're passing a bad file path argument. If you show me the code I might be able to help you more.
    However, from Neuroph Studio you shouldn't be getting this error, since you must choose an existing file.
    The required format is plain CSV (comma-separated values): a text file with values separated by commas.
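
    For example (with made-up numbers), a training file for a network with 2 inputs and 1 output has one sample per row, the two inputs first and the expected output last:

        0.2,0.5,0.7
        0.1,0.9,0.4

    and it can then be loaded by passing the full path of the file, something like:

        // 2 inputs, 1 output, comma as delimiter; the path here is only an example
        TrainingSet trainingSet = TrainingSet.createFromFile("C:/data/TrainData.txt", 2, 1, ",");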

    Regarding printing the training set: please provide the code you're using to create and print it, and I'll be able to help you.

    Zoran

     
  • Diana Hintea

    Diana Hintea - 2012-04-13

    Thanks for the reply.
    I managed to sort out the code; the error came from the fact that I wasn't normalizing the data.
    I still have one question. After training the network I test it on a different data set and I get values around 0.05 for all of the roughly 10,000 inputs, which seems a bit strange. Maybe you can have a look over the code and tell me if you see anything out of place. In the following code I simply train the network on the training data set, then test it on another set and write the network outputs to a file.

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.IOException;

    import org.neuroph.core.NeuralNetwork;
    import org.neuroph.core.learning.SupervisedTrainingElement;
    import org.neuroph.core.learning.TrainingSet;
    import org.neuroph.nnet.MultiLayerPerceptron;

    public class Prediction
    {
        public static void main(String[] args) throws IOException
        {
            String inputFileNameTrain = Prediction.class.getResource("TrainData.txt").getFile();
            String inputFileNameTest = Prediction.class.getResource("TestData.txt").getFile();

            // 2 inputs, 1 hidden neuron, 1 output
            MultiLayerPerceptron neuralNet = new MultiLayerPerceptron(2, 1, 1);

            TrainingSet tempTrainingSet = TrainingSet.createFromFile(inputFileNameTrain, 2, 1, ",");
            TrainingSet tempTestingSet = TrainingSet.createFromFile(inputFileNameTest, 2, 1, ",");

            neuralNet.learn(tempTrainingSet);

            testNeuralNetwork(neuralNet, tempTestingSet);
        }

        public static void testNeuralNetwork(NeuralNetwork neuralNet, TrainingSet<SupervisedTrainingElement> trainingSet) throws IOException
        {
            int i = 0;
            double[] output = new double[trainingSet.size()];   // collected outputs (not otherwise used here)

            BufferedWriter writer = null;
            try
            {
                writer = new BufferedWriter(new FileWriter("src/outputNetwork.txt"));

                // run every test element through the network and write its output to the file
                for (SupervisedTrainingElement trainingElement : trainingSet.elements())
                {
                    neuralNet.setInput(trainingElement.getInput());
                    neuralNet.calculate();
                    double networkOutput = neuralNet.getOutput()[0];

                    output[i++] = networkOutput;

                    writer.write(String.format("%.6f", networkOutput));
                    writer.newLine();
                    writer.flush();
                }
            }
            catch (IOException ex)
            {
                ex.printStackTrace();
            }
            finally
            {
                if (writer != null)
                {
                    writer.close();
                }
            }
        }
    }

    I would also like to know where in the code I can specify the learning rate and other characteristic parameters.
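
    For example, is it done through the learning rule? Something like the following is what I have in mind (just a sketch; MomentumBackpropagation and the parameter values are my guess, not tested):

        // sketch only: configure the learning rule before training,
        // assuming MomentumBackpropagation from org.neuroph.nnet.learning
        MomentumBackpropagation learningRule = new MomentumBackpropagation();
        learningRule.setLearningRate(0.2);  // made-up value
        learningRule.setMomentum(0.7);      // made-up value
        learningRule.setMaxError(0.01);     // stop once the total error falls below this
        neuralNet.setLearningRule(learningRule);
        neuralNet.learn(tempTrainingSet);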

    Many thanks.

     
