Hi,
I'm a student in Germany using your UnBBayes library to work with Bayesian networks in Java, and I like it very much!
A rather important task in my program is the improvement of the parameters by feedback from a test person. So I was trying several things with the API and your GUI to learn or improve a network with training data. Because I haven't found any complete working example, some questions are still open, so you would save me much time if you could help me. :)
Some of my humble questions (regarding the "incremental learning" plugin):
Does a training set look the same as the samples you can create from a network?
What is a frontier set?
What is a "compacted file"?
Is "?" the right symbol for "no information about the state of the node"?
Can I "incrementally" add more training data so that the Bayesian net parameters improve?
Does "incremental learning" really mean that the existing parameters are improved by the training data (and the structure stays the same), in contrast to "learning", which creates the net from training data alone (and needs a definition of the structure)?
And my last question for now: is there an easy way to realize incremental parameter learning with the API?
So maybe you can help me with some point on that list :)
A partial answer or example would help me too ;)
And I don't expect a detailed answer to the last question, because I have already experimented with the library, and it's a bit tricky but possible, I think ;)
Many, many thanks and best regards,
Erik
Hello, Erik.
First, I'd like to thank you for your interest in the UnBBayes project.
Although I'm not a member of the UnBBayes incremental learning team, I think I can help you with some of the topics (I'm afraid I cannot answer all of your questions here, because I'm not sure about the details of the incremental learning plugin either).
Does a training set look the same as the samples you can create from a network?
Yes. The learning modules (the "basic" one, and the incremental learning one too) support the "uncompressed" "ordinal" data set. You can create such "ordinal" data sets using the sampling modules/plugins.
What is a "compacted file"?
As far as I know, it is a data format similar to the "ordinal" data sets (generated by the sampling modules), but each repeated record is replaced by a single entry plus a number indicating how many times it repeats. I think it would be more straightforward if we just called these "compressed" data sets.
...Actually, I'm rather unsure about the above definition, because I've never used a compressed file myself...
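For what it's worth, here is a rough illustration of the idea. The node names and the exact file syntax are my own guesses, so please check against files actually produced by the sampling plugin:

```
# "Uncompressed" data set: one sampled case per line,
# with "?" marking a missing observation.
Rain   Sprinkler  WetGrass
true   false      true
true   false      true
false  true       ?

# Hypothetical "compressed"/"compacted" form: repeated cases
# collapsed, with a count of how often each one occurred.
Rain   Sprinkler  WetGrass   Count
true   false      true       2
false  true       ?          1
```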
Is "?" the right symbol for "no information about the state of the node"?
Yes, but I've never tested it in the incremental learning plugin (I've only
used it in the "basic" learning plugin, once in a while).
Can I "incrementally" add more training data so that the Bayesian net parameters improve?
Does "incremental learning" really mean that the existing parameters are improved by the training data (and the structure stays the same), in contrast to "learning", which creates the net from training data alone (and needs a definition of the structure)?
Theoretically, that is supposed to be the difference between the "incremental" learning plugin and the other "learning" plugin: the capability to gradually improve your model by supplying more input data to a previously created/modeled Bayesian network.
I'm not sure whether the "incremental" learning modifies the original structure (I'm only sure that it does modify the parameters). The (basic) learning plugin can perform parameter learning AND structure learning. Thus, the "basic" learning plugin does not usually need a structure definition (if the data sets are large enough), as you said.
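Just to illustrate the concept (this is NOT UnBBayes code, only a minimal sketch of how incremental parameter learning can work): keep a counter per state, add each new observation to the counters, and read the parameter off the normalized counts. Feeding more data later simply continues from the existing counts, which is exactly the "gradual improvement" idea.

```java
import java.util.Arrays;

// Minimal sketch of incremental parameter learning for one CPT column.
// In a real network you would keep one such counter set per parent
// configuration of each node.
public class IncrementalCpt {
    private final double[] counts; // one counter per state of the node

    public IncrementalCpt(int numStates) {
        counts = new double[numStates];
        Arrays.fill(counts, 1.0); // Laplace prior: start with pseudo-counts
    }

    /** Feed one observed state; missing observations ("?") are simply skipped. */
    public void observe(int state) {
        counts[state] += 1.0;
    }

    /** Current parameter estimate P(state) given everything seen so far. */
    public double probability(int state) {
        double total = 0.0;
        for (double c : counts) {
            total += c;
        }
        return counts[state] / total;
    }
}
```

With the Laplace prior, a fresh two-state node starts at P = 0.5 for each state, and every additional batch of observations nudges the estimate further, without ever retraining from scratch.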
I do not believe there is an easy way to use incremental learning as a "formal" API (because it contains some very old code, strictly bound to some GUI components), but you can add the ".jar" file of any plugin (the JAR located in the same folder as "plugin.xml") to your project and manipulate/extend it like any Java class (e.g. using the adapter/wrapper pattern).
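For example, the wrapper idea looks roughly like this. "LegacyLearner" is a hypothetical stand-in for whatever GUI-bound class the plugin JAR actually exposes, not a real UnBBayes class:

```java
// Adapter/wrapper sketch: hide a GUI-entangled legacy class behind a
// small interface that the rest of your application depends on.
public class LearnerAdapterExample {

    /** The narrow, GUI-free API your own code wants to use. */
    public interface ParameterLearner {
        double[] learnParameters(double[] trainingData);
    }

    /** Hypothetical stand-in for the old plugin code you cannot modify. */
    public static class LegacyLearner {
        public double[] runLearning(double[] trainingData) {
            return trainingData.clone(); // placeholder for the real algorithm
        }
    }

    /** Adapter: delegates to the legacy class, keeping GUI concerns out. */
    public static class LegacyLearnerAdapter implements ParameterLearner {
        private final LegacyLearner delegate = new LegacyLearner();

        @Override
        public double[] learnParameters(double[] trainingData) {
            return delegate.runLearning(trainingData);
        }
    }
}
```

This way the rest of your program only depends on the small `ParameterLearner` interface, and the messy GUI-bound code stays isolated inside the adapter.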
I'd like to apologize for my lack of information about the code... When I was working on incremental learning to convert it into a plugin, I treated it as a "black box", only "wrapping" the original code into the plugin format.
I hope I've helped you... Somehow...
Anyway, enjoy UnBBayes ;)
Sincerely,
Shou Matsumoto