Is it possible to tune parameters (such as the splitting criterion, maximum depth, etc.) in C++? So far I can't find anything in the documentation; it seems the forest only needs to be created and given the training set with labels, followed by a call to the train method:
myRandomForest.train ( features, labels );
Is that correct, or have I missed something? If so, how can I tune those parameters?
Alternatively, is there a way to speed up classification at the expense of more false positives, and to print the classification certainty?
The GRandomForest class does not provide access to setting parameters. (It should; I just have not implemented that yet.) However, you can create an instance of GBag and add instances of GDecisionTree to it. This is identical to using GRandomForest (if you call "useRandomDivisions" on each tree), and it lets you set other parameters on the GDecisionTree instances.
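A rough sketch of that approach is below. This is an assumption-laden illustration, not tested code: constructor signatures (e.g. whether GBag and GDecisionTree take a GRand&) differ across Waffles versions, and the commented-out parameter setter is hypothetical — check the GDecisionTree header for the actual tuning methods available in your version.

```cpp
#include <GClasses/GMatrix.h>
#include <GClasses/GEnsemble.h>      // GBag
#include <GClasses/GDecisionTree.h>
#include <GClasses/GRand.h>

using namespace GClasses;

int main()
{
	GRand rand(0);
	GBag bag(rand); // ensemble that votes over its member models
	for(size_t i = 0; i < 50; i++)
	{
		GDecisionTree* pTree = new GDecisionTree(rand);
		pTree->useRandomDivisions(1); // random splits, as in a random forest
		// Tune other per-tree parameters here, e.g. a leaf-size threshold.
		// (Method names are version-dependent; consult the GDecisionTree docs.)
		bag.addLearner(pTree); // the bag takes ownership of the tree
	}
	GMatrix features, labels; // load your training data here
	bag.train(features, labels);
	return 0;
}
```

With the trees exposed individually, each one can be configured before training, which is exactly what GRandomForest currently hides.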