I've only had a chance to look through the jgprog code fairly casually, but it looks really good. Good work! I did a LISP implementation of GP several years ago, but at the time I didn't have the processing power to breed anything much better than the function f=x^2, and I haven't dusted off the code since.
I've had an interest since that time in using GP to breed neural nets. I actually started a Java implementation a while ago, but that project stalled out before it got too far, partly due to time constraints and partly because I was teaching myself Java as I went. Anyway, I'm pretty happy to have found jgprog. I believe I could use the system to evolve ANNs. I would need to add a Node subclass representing a weighted link, as well as the code to actually evaluate the fitness of an ANN, but in general the process of evolving a neural network is not too different from evolving a program tree. There are two different processes I'd like to implement:
1. Teaching an ANN directly with GP, evolving the weights of the links.
2. Using GP to evolve only the structure of the ANN. The weights of the links would not be part of the evolution process, only which nodes link to which. The fitness function would actually train the network using a conventional ANN learning algorithm (e.g. backprop), and the fitness would be a measure of how well the network did after a certain number of training iterations. In this way, the ANN structures that learned the fastest and best would be selected/crossbred for the next GP population.
It's actually #2 that interests me the most, but I expect running it will take a long time and a lot of resources, since the fitness function itself is a complex, lengthy process that would have to happen for every individual in each generation. I'm not sure if this is an original idea or not (probably not, but regardless, I'd like to implement it). There's a rough sketch of what I mean below.
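Just so it's concrete, here's roughly how I picture the fitness evaluation for #2. This is purely my own sketch -- none of these class or method names ("EvolvedNet", "StructureFitness", "trainOneEpoch") exist in jgprog:

    // Sketch only: the GP tree encodes the topology; the fitness
    // function trains the decoded network with a conventional
    // learning rule and scores how well/fast it learned.
    interface EvolvedNet {
        void randomizeWeights();
        // One pass of an ordinary learning algorithm (e.g. backprop);
        // returns the mean error over the training set.
        double trainOneEpoch(double[][] inputs, double[][] targets);
    }

    class StructureFitness {
        private static final int EPOCHS = 100; // fixed training budget

        double evaluate(EvolvedNet net, double[][] in, double[][] out) {
            net.randomizeWeights();            // weights are NOT evolved
            double err = Double.MAX_VALUE;
            for (int epoch = 0; epoch < EPOCHS; epoch++) {
                err = net.trainOneEpoch(in, out);
            }
            return 1.0 / (1.0 + err);          // low error => high fitness
        }
    }

The decoding step (GP tree to network) is where the weighted-link Node subclass would come in; I've left it out here.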
Anyway, let me know if any of this interests you in any way. And again, cool program!
bg
Hi, bg...
Well, I'm glad you like the program! I'm regularly working on it to make it go faster and do more, so I hope this platform will be useful to you.
I have read a few papers dealing with evolving NNs through GP. Most of them take the approach of evolving the structure of the NN and paying less attention to the weights. Effectively, the chromosome evolved by GP is the genotype, while the NN is the phenotype. The functions that operate on weights are usually very simple, things like "invert", "strengthen", and "weaken".
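To give you an idea, those operators usually boil down to something as trivial as this (my own toy sketch, not code from any of those papers, and not part of jgprog):

    // Toy weight operators an evolved program might apply as it
    // walks the links of the phenotype network. The 10% step size
    // is an arbitrary choice for illustration.
    class WeightOps {
        static double invert(double w)     { return -w; }
        static double strengthen(double w) { return w * 1.1; }
        static double weaken(double w)     { return w * 0.9; }
    }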
I haven't read anything identical to what you suggest -- fixing the structure and evolving the weights, or evolving the structure and fixing the weights (although the latter is very close to what I described above).
It sounds like a neat (and might I add, Groovy) thing to implement. All you'd need is a set of functions and terminals, plus the fitness function.
I'm currently working on a way to make the functions and terminals evaluate faster (by effectively inlining them into straight bytecode), but there's not much I can do to speed up a fitness function. My approach has been to evaluate one generation, have the program stop, and spit out profiling data via hprof (details are on Sun's JavaSoft site -- search for "hprof"). Then I can optimize the fitness function; I can usually cut evaluation time by as much as 50% by eliminating some "dumb" things in it.
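For reference, the kind of invocation I mean looks like this (the main class name is just a stand-in for whatever your run class is; "java -Xrunhprof:help" lists the options your JDK supports):

    java -Xrunhprof:cpu=samples,depth=8,file=gp.hprof.txt MyGPRunner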
I'm also working on getting the engine to run distributed, so that if your fitness function is CPU-intensive and there's nothing you can do about it, you can let other computers evaluate portions of the population.
There are so many Groovy things that need to be done, but I'm the only developer :( Maybe I'll use this new SourceForge Jobs thing to attract developers...
Later,
--Rob
Thanks for the input and info. I'm going to need to spend some time getting myself back up to speed on a few things I haven't thought about in a while, so I may eventually have a few dumb questions for you about jgprog -- if that's okay. I'm actually having trouble compiling the source with the GNU bytecode stuff and all, but I'm sure I'm not doing it quite right. I still have pretty limited experience with Java -- although I really like it -- so I'm going to be poking through the JDK docs a bit. I'll let you know how it goes.
bg
OK, no problem. If you need some help with compiling the stuff, just ask.
--Rob
There's actually been quite a lot of work on using GP to evolve ANNs. Probably the best-known work is Frederic Gruau's cellular encoding; his "Genetic micro programming of neural networks" (in Advances in GP) is a good place to start.
Gruau uses GP to evolve both the structure and the weights in a very interesting way. There have, however, been people who have used both GAs and GP to do things like evolving weights for fixed structures.
Bill Langdon's massive GP bibliography (http://www.cs.bham.ac.uk/~wbl/biblio/gp-bibliography.html) should give you plenty of useful pointers.