• Nobody/Anonymous


    I have a question about the GAdaBoost implementation.

    The AdaBoost algorithm computes a new distribution over the training
    examples on each round and passes it to the learner for training. Thus, the
    learner has to provide a training method that supports weighted examples.

    IMHO this is not implemented in GAdaBoost.

    My question is whether this is a special variant of the AdaBoost algorithm,
    or whether I have misunderstood it.
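    For reference, the per-round weight update that standard AdaBoost performs can be sketched as follows. This is a minimal illustration of the textbook algorithm, not the GAdaBoost code; the function name and the +/-1 label convention are assumptions for the sketch.

```python
import math

def adaboost_round(weights, predictions, labels):
    """One AdaBoost weight update (minimal sketch).

    weights: current distribution over training examples (sums to 1)
    predictions, labels: the weak learner's outputs and the true labels,
        both encoded in {-1, +1}
    Returns (new_weights, alpha), where alpha is the learner's vote weight.
    """
    # Weighted training error of the weak learner on the current distribution.
    err = sum(w for w, p, y in zip(weights, predictions, labels) if p != y)
    alpha = 0.5 * math.log((1.0 - err) / err)
    # Misclassified examples gain weight; correctly classified ones lose it.
    new_weights = [w * math.exp(-alpha * p * y)
                   for w, p, y in zip(weights, predictions, labels)]
    total = sum(new_weights)
    return [w / total for w in new_weights], alpha
```

    The resulting `new_weights` is exactly the distribution that the next weak learner would need to train against, which is why the learner must accept weighted examples.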

  • Nobody/Anonymous

    You are right. I approximated weighted patterns by resampling the training
    set. That is, I draw each pattern with probability proportional to its weight.
    This is not technically equivalent to AdaBoost, it's just an approximation,
    but it works with any algorithm, not just those that support weighted
    patterns. Perhaps I should rename it to avoid confusion. Any suggestions? How
    about GeneralizedAdaBoost? (I'll bet someone has already published something
    about this approach, but I haven't seen it.)
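    The resampling described above can be sketched as follows. This is an illustration of the general idea, not the actual GAdaBoost code; the function name is invented for the example.

```python
import random

def resample_by_weight(patterns, weights, seed=0):
    """Draw a new training set of the same size, sampling patterns with
    replacement, each with probability proportional to its weight."""
    rng = random.Random(seed)
    return rng.choices(patterns, weights=weights, k=len(patterns))
```

    An ordinary (unweighted) learner trained on the resampled set then approximates training on the weighted distribution, which is why this works with any base algorithm.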

  • Nobody/Anonymous

    Thank you for answering. This solution sounds interesting. An empirical
    comparison of the error rates of the two approaches would be valuable here.

    In a short web search, I didn't find an AdaBoost variant that uses resampling.

    I propose simply naming it ResamplingAdaBoost. IMHO, GeneralizedAdaBoost
    suggests that you only have to set a parameter correctly to get the original
    AdaBoost algorithm.

  • Nobody/Anonymous

    That sounds good. I'll rename it to ResamplingAdaBoost.

  • Nobody/Anonymous

    Hello again,

    I want to report a small mistake:

    During deserialization, GResamplingAdaBoost isn't recognized.

    Although you changed the name of the class, GResamplingAdaBoost is still
    sorted among the classes that start with a letter smaller than 'j' in
    GLearner.cpp, line 1395.

    A small re-sort would fix it.
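    For anyone curious why the ordering matters: if the deserializer looks class names up in a list it assumes is sorted (e.g. via binary search or an alphabetically split dispatch), an out-of-place entry is simply never found. I don't know the exact lookup GLearner.cpp uses, so the sketch below, including the class-name lists, is only an illustration of the hazard.

```python
import bisect

def find_class(names, name):
    """Binary search over a name list that is assumed to be sorted.
    Returns the index of the name, or -1 if the search misses it."""
    i = bisect.bisect_left(names, name)
    if i < len(names) and names[i] == name:
        return i
    return -1

# Correctly sorted: the renamed class is found.
ok = ["GAdaBoost", "GBag", "GKNN", "GResamplingAdaBoost"]
# Entry left where the old name used to sort: the search misses it.
stale = ["GAdaBoost", "GResamplingAdaBoost", "GBag", "GKNN"]
```

    With the stale ordering, `find_class(stale, "GResamplingAdaBoost")` returns -1 even though the name is in the list, which matches the "isn't recognized" symptom reported above.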


  • Mike Gashler - 2012-02-06

    Thanks for catching this. I have fixed it now.


