
PocketSphinx MLLR adaptation problem

  • Justin Chiu

    Justin Chiu - 2011-12-22

    According to http://cmusphinx.sourceforge.net/wiki/tutorialadapt

    I trained an MLLR matrix and want to adapt the same model (hub4wsj_sc_8k)
    used in this tutorial. Whether or not I pass -mllr, I get identical results,
    which I find strange. Even if MLLR changes the model only slightly, shouldn't
    there be at least some difference? The log also contains lines indicating
    that the MLLR transform is loaded:

    INFO: ps_mllr.c(66): Reading MLLR transformation file
    /net/dogwood/usr1/jchiu1/pocketsphinx/Justin_hub_mllr
    INFO: ms_gauden.c(198): Reading mixture gaussian parameter:
    /net/dogwood/usr1/jchiu1/pocketsphinx/hub4wsj_sc_8k/means
    INFO: ms_gauden.c(292): 1 codebook, 3 feature, size:
    INFO: ms_gauden.c(294): 256x13
    INFO: ms_gauden.c(294): 256x13
    INFO: ms_gauden.c(294): 256x13
    INFO: ms_gauden.c(198): Reading mixture gaussian parameter:
    /net/dogwood/usr1/jchiu1/pocketsphinx/hub4wsj_sc_8k/variances
    INFO: ms_gauden.c(292): 1 codebook, 3 feature, size:
    INFO: ms_gauden.c(294): 256x13
    INFO: ms_gauden.c(294): 256x13
    INFO: ms_gauden.c(294): 256x13
    INFO: ms_gauden.c(354): 0 variance values floored

    I also tried modifying the MLLR matrix, setting some of its values to 999999
    (which "should" affect the model, since the change is huge), but the result
    is still the same. Does anyone have an idea what is going on?
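
    For reference, my understanding is that passing -mllr on the command line is
    equivalent to attaching the transform through the API, roughly like the
    sketch below (assuming the ps_mllr_read / ps_update_mllr calls; the -lm and
    -dict paths are just placeholders, not my actual setup):

        /* Minimal sketch: initialize the decoder, then read the MLLR matrix
         * and attach it before decoding.  This should be what -mllr does
         * internally. */
        #include <pocketsphinx.h>

        int main(void)
        {
            cmd_ln_t *config;
            ps_decoder_t *ps;
            ps_mllr_t *mllr;

            config = cmd_ln_init(NULL, ps_args(), TRUE,
                                 "-hmm", "hub4wsj_sc_8k",
                                 "-lm", "your.lm",     /* placeholder */
                                 "-dict", "your.dic",  /* placeholder */
                                 NULL);
            ps = ps_init(config);

            /* Equivalent to: -mllr Justin_hub_mllr */
            mllr = ps_mllr_read("Justin_hub_mllr");
            ps_update_mllr(ps, mllr);

            /* ... decode utterances as usual ... */

            ps_free(ps);
            return 0;
        }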

     
  • Nickolay V. Shmyrev

    MLLR has no effect with semi-continuous models. You can try a continuous
    model to test it.

     
  • Justin Chiu

    Justin Chiu - 2011-12-23

    May I ask why it has no effect? I checked the code and see that it modifies
    the means and variances of the Gaussians. I know a semi-continuous HMM uses
    the same set of Gaussians with different weights for each state, but if we
    modify the Gaussians themselves, shouldn't that at least change the
    hypothesis score? I don't understand why the result is exactly the same.
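
    What I mean by "modifies the means": as far as I understand, MLLR applies an
    affine transform to every codebook mean, roughly like this sketch (not the
    exact ps_mllr.c / ms_gauden.c code, just the idea):

        /* mu' = A * mu + b, with A stored row-major as len x len and b as a
         * len-dimensional offset. */
        void mllr_transform_mean(float *mean, int len,
                                 const float *A, const float *b)
        {
            float out[64];                  /* scratch; assumes len <= 64 */
            for (int i = 0; i < len; i++) {
                out[i] = b[i];
                for (int j = 0; j < len; j++)
                    out[i] += A[i * len + j] * mean[j];
            }
            for (int i = 0; i < len; i++)
                mean[i] = out[i];
        }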

     
  • Nickolay V. Shmyrev

    Hello

    Semi-continuous models are built from a very small number of Gaussians (256
    per stream, times 3 feature streams) which are combined with different
    mixture weights to form senones. No matter how you change the Gaussians,
    what matters is how they are mixed, not their individual values. That is why
    mixture-weight adaptation is more important than MLLR for these models.

    Because MLLR doesn't make much sense here, it is not even implemented in
    s2_semi_mgau.c. Only the generic continuous-model computation in ms_mgau.c
    actually uses the MLLR data.
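
    Roughly, the scoring structure looks like this (a simplified sketch, not the
    actual s2_semi_mgau.c code, which works with quantized log values and top-N
    densities): the shared codebook densities are evaluated once per frame, and
    each senone only applies its own mixture weights to them:

        #include <math.h>

        #define N_DENSITY 256   /* shared codebook size per feature stream */

        /* b_j(o) = sum_k mixw[j][k] * N(o; mu[k], var[k]); the density[]
         * values are shared by every senone, only the weights differ. */
        double senone_score(const double *density,   /* N(o; mu[k], var[k]) */
                            const double *mixw_j)    /* weights of senone j */
        {
            double b = 0.0;
            for (int k = 0; k < N_DENSITY; k++)
                b += mixw_j[k] * density[k];
            return log(b);
        }

    Since s2_semi_mgau.c never applies the MLLR matrix to this shared codebook,
    the scores come out the same with or without -mllr.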

     
  • Justin Chiu

    Justin Chiu - 2011-12-26

    So the semi-continuous HMM does not use ms_mgau.c? The log I posted in my
    first message indicates that the MLLR matrix was loaded even though I am
    using a semi-continuous model. So the code still calls ms_mgau even though
    it only uses the information from s2_semi_mgau? Is that the case?

     
  • Justin Chiu

    Justin Chiu - 2011-12-26

    Hmm, I've checked s2_semi_mgau.c.

    Now I am thinking that it stores some values like

    s->topn_beam
    s->max_topn
    s->n_topn_hist
    s->topn_hist

    All of these are stored in the acmod, but where does other code use these
    values? Maybe the means and variances are modified by MLLR only after these
    values have been taken, so it makes no difference whether we use MLLR or
    not...

     
