>>>From: zhong jiahang <Jiahang.Zhong@cern.ch>
>>> To: Peter Speckmayer <peter.speckmayer@cern.ch>
>>> Hi Peter,
>>>
>>> Thanks for this information. But I think the SplitMode may not work as
>>> expected.
>>>
>>> In my input I have five weight values, each containing approximately the
>>> same number of entries. What I observed is that with
>>> "SplitMode=Alternate", the number of training events for each weight is
>>> basically evenly distributed, as expected. However, for "SplitMode=Random"
>>> and "SplitMode=Block", I get an identical weight distribution: ~40% for
>>> each of the first two weight values and ~20% for the third one. This is
>>> expected for "Block", but shouldn't I instead get an evenly distributed
>>> training sample for "Random"?
>>>
Hi, I can reproduce this problem. However, it occurs only if one of the NSigTrain/NSigTest/NBkgTrain/NBkgTest event numbers in the Factory::PrepareTrainingAndTestTree() call is set to zero. It does not occur if all four event numbers are specified.
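As a workaround until the bug is fixed, one can pass all four event numbers explicitly. A minimal sketch, assuming the usual TMVA option-string interface (the variable names `factory` and `myCut` and the event counts are placeholders, not taken from the original report):

```cpp
// Sketch of the suggested workaround: specify all four event numbers
// explicitly so none of them defaults to zero.
// "factory" is a TMVA::Factory*, "myCut" a TCut; the counts are examples.
factory->PrepareTrainingAndTestTree( myCut,
    "nTrain_Signal=1000:nTrain_Background=1000:"
    "nTest_Signal=1000:nTest_Background=1000:"
    "SplitMode=Random:NormMode=NumEvents:!V" );
```

With all four counts given, "SplitMode=Random" should then yield the evenly distributed training sample described above.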