From: Aaron A. <aa...@cs...> - 2007-08-17 22:16:22
Apologies for the late response. See inline comments.

On Wed, 15 Aug 2007, Haiyong Xu wrote:

> I found the current implementation of JBoost mainly focuses on the
> Alternating Decision Tree (ADT), which means the weak learners are
> similar to decision stumps.

ADTrees are capable of using any weak learner. The nice thing about using
decision stumps with ADTrees is that, since each path is a conjunction,
ADTrees are able to form quite a rich representation from a limited base
class of hypotheses. See the original paper for more details:

http://www.cse.ucsd.edu/~yfreund/papers/atrees.pdf

> Is there any way to construct a boosting system based on another type of
> weak learner, say neural networks? Thanks.

In JBoost, a weak learner (typically something similar to a decision
stump) can be replaced by just about anything, including neural networks.
If you wanted to implement a neural network weak learner, you could mimic
the existing weak learners in ./src/jboost/learner/. The easiest weak
learner to understand is InequalitySplitter.java (which is created by
InequalitySplitterBuilder.java). A simple neural net implementation should
be fairly straightforward in this framework.

Note that there are a variety of reasons why boosting neural nets may not
be the best idea. The margin analysis given by Schapire et al. (1998,
Annals of Statistics) gives generalization bounds in terms of the VC
dimension of the base classifier; while these bounds are certainly quite
loose, they do provide some intuition for why boosting may not overfit.

Also note that JBoost can boost decision trees with the command

  ./jboost ... -ATreeType ADD_ROOT_OR_SINGLES

You also have the other options:

  -ATreeType type   The type of ATree to create. There are several options:
    ADD_ALL             Create a full ADTree (default).
    ADD_ROOT            Add splits only at the root, producing a flat tree.
                        This is equivalent to boosting decision stumps.
    ADD_SINGLES         Create a decision tree.
    ADD_ROOT_OR_SINGLES Create a linear combination of decision trees.
                        This is equivalent to simultaneously growing
                        boosted decision trees.

An existing implementation of boosting with neural nets (and many other
weak classifiers) exists in WEKA. Last time I checked, neural nets were
under the name of Multilayer Perceptron.

Aaron
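To make the "neural net as weak learner" idea concrete, here is a minimal,
self-contained sketch of a single perceptron trained on weighted examples,
the kind of hypothesis a boosting loop could call each round. This is an
illustrative assumption, not JBoost's actual weak-learner API; the class
and method names below are made up, and a real JBoost learner would
instead follow the Splitter/SplitterBuilder structure mentioned above.

```java
/**
 * Hypothetical sketch: a tiny perceptron usable as a boosting weak
 * learner. Labels are +1/-1; each example carries a boosting weight, so
 * misclassified high-weight examples pull the hyperplane harder.
 * (Names are illustrative; this is not JBoost's actual interface.)
 */
public class PerceptronWeakLearner {
    private final double[] w;  // one weight per feature
    private double bias;

    public PerceptronWeakLearner(int nFeatures) {
        this.w = new double[nFeatures];
    }

    /** Weighted perceptron updates over the training set. */
    public void train(double[][] x, int[] y, double[] weight, int epochs) {
        for (int e = 0; e < epochs; e++) {
            for (int i = 0; i < x.length; i++) {
                if (predict(x[i]) != y[i]) {
                    // Scale the usual perceptron update by the
                    // example's boosting weight.
                    for (int j = 0; j < w.length; j++) {
                        w[j] += weight[i] * y[i] * x[i][j];
                    }
                    bias += weight[i] * y[i];
                }
            }
        }
    }

    /** Returns the hypothesis's prediction, +1 or -1. */
    public int predict(double[] features) {
        double s = bias;
        for (int j = 0; j < w.length; j++) {
            s += w[j] * features[j];
        }
        return s >= 0 ? 1 : -1;
    }
}
```

A boosting driver would reweight the examples after each round and train a
fresh PerceptronWeakLearner on the new weights, exactly as it would with a
decision stump.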