Hey guys, I'm building a neural network for college with 10 inputs, 9 outputs, and a dataset of 1 million rows. I've noticed that the dataset size doesn't determine the number of hidden layers, so I'm just using 10, but I have some questions if you guys don't mind answering.

Does the choice between supervised and unsupervised learning affect how long it takes to find a solution? I didn't quite understand the difference between them.

Of the learning rules, which one should I use (backpropagation, backpropagation with momentum, etc.)?

I read somewhere that there's a way to split a large set into smaller subsets and test which works best, for faster learning (not cross-validation; if I understood correctly, cross-validation takes small test sets, not small training sets). I can't find this option in Neuroph.

Is there any advice at all you can give me to make this work? It's kind of frustrating: I used a program to generate data that covers almost every possible input/output combination, but I can't get the algorithm to learn it.
If your data set has target outputs, use supervised learning; otherwise use unsupervised.
Use backpropagation with momentum.
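To see why momentum helps, here is a minimal Python sketch of the weight-update rule (this is an illustration of the general technique, not Neuroph's internal code; the learning-rate and momentum values are arbitrary examples):

```python
# Gradient-descent weight update with momentum: a minimal sketch.
# The velocity term accumulates past gradients, so updates keep moving
# through flat regions of the error surface and oscillation is damped.
def momentum_update(weight, gradient, velocity, lr=0.1, momentum=0.9):
    """Return the updated (weight, velocity) pair for one parameter."""
    velocity = momentum * velocity - lr * gradient
    return weight + velocity, velocity

# Two consecutive updates with the same gradient: the second step is
# larger because the previous velocity carries over.
w, v = 1.0, 0.0
w, v = momentum_update(w, 0.5, v)   # step of -0.05
w, v = momentum_update(w, 0.5, v)   # step of -0.095 (momentum kicks in)
```

With momentum set to 0 this reduces to plain backpropagation's update, which is why the momentum variant is usually at least as fast on a large data set like yours.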
Yes, there is such an option: right-click the data set, then choose 'create training and test subset'.
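If you want to do the same split outside the GUI, a minimal sketch in Python (not Neuroph code; the 70/30 fraction and the fixed seed are just example choices):

```python
import random

def train_test_split(rows, train_fraction=0.7, seed=42):
    """Shuffle the rows and split them into training and test subsets."""
    shuffled = rows[:]                      # copy so the caller's list is untouched
    random.Random(seed).shuffle(shuffled)   # shuffle before splitting
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

rows = list(range(1000))
train, test = train_test_split(rows)
# 700 training rows and 300 test rows, with no row in both subsets
```

Shuffling first matters: if your generated data is ordered by input pattern, an unshuffled split would give the network a training set that never sees some regions of the input space.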