tuning random forest

  • Anonymous

    Anonymous - 2016-06-17

    Is it possible to tune parameters (such as the splitting criterion, maximum depth, etc.) in C++? So far I can't find anything in the documentation; it seems the forest only needs to be created and then trained by calling the train method with the feature and label matrices:

    myRandomForest.train ( features, labels );

    Is that correct, or have I missed something? If so, how can I tune those parameters?

    Alternatively, is there a way to speed up the classification process at the expense of more false positives, and to print the classification confidence?

     
  • Mike Gashler

    Mike Gashler - 2016-06-30

    The GRandomForest class does not provide access to setting parameters. (It should; I just have not implemented that yet.) However, you can create an instance of GBag and add instances of GDecisionTree to it. This will behave identically to GRandomForest (if you call "useRandomDivisions" on each tree), and it will allow you to set other parameters on the GDecisionTree instances.
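
    A minimal sketch of that suggestion, assuming a Waffles-era API: the class names GBag and GDecisionTree, the useRandomDivisions call, and train(features, labels) come from this thread, while the addLearner method, the setLeafThresh tuning setter, and the header paths are my recollection of the library and may differ by version. This is an illustration, not a tested build:

    ```cpp
    #include <GClasses/GMatrix.h>
    #include <GClasses/GDecisionTree.h>
    #include <GClasses/GEnsemble.h>

    using namespace GClasses;

    // Build a hand-rolled "random forest" out of a GBag of GDecisionTree
    // instances, so that per-tree parameters can be tuned.
    void trainTunedForest(GMatrix& features, GMatrix& labels)
    {
        GBag bag; // bagging ensemble; GRandomForest is essentially this plus random divisions
        for(size_t i = 0; i < 64; i++) // 64 trees, an arbitrary choice
        {
            GDecisionTree* pTree = new GDecisionTree();
            pTree->useRandomDivisions(1); // choose split attributes randomly, as GRandomForest does
            pTree->setLeafThresh(4);      // assumed tuning knob: stop splitting small leaves
            bag.addLearner(pTree);        // assumed: the bag takes ownership of the tree
        }
        bag.train(features, labels);      // same call the question above uses
    }
    ```

    The point of this construction is that each GDecisionTree is created explicitly, so any setter the tree class exposes can be called on it before training, which GRandomForest's constructor does not allow.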

     
