
CMN problem in pocketsphinx

2010-10-18
2012-09-22
  • Mirko Budric

    Mirko Budric - 2010-10-18

    Hello.

    I have some experience with S4 and now I am switching to PS.

    I have discovered some strange things. I am testing some acoustic models
    (8 kHz telephone) for PS. When I used the wave2feat extractor I achieved
    97% accuracy with the default training setup (1000 senones, 8 Gaussians).

    But when I used our previously developed frontend (for both training and
    testing), the accuracy dropped to 87%. I was able to repair the bad
    accuracy by changing the language weight parameter from 6.5 to 2.1; the
    accuracy then rose again to 96%. Now, using this setup with the wave2feat
    features, I get almost 98.8% accuracy.
    Why does the -lw parameter influence the result so much, given that I am
    using a JSGF grammar to test isolated words?
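
    (For reference, a minimal sketch of how such a test decoder could be set
    up through the pocketsphinx C API; the model, dictionary, and grammar
    paths and the 2.1 weight are placeholders, and the -jsgf option is
    assumed to be available in the release used.)

        #include <pocketsphinx.h>

        int main(void)
        {
            cmd_ln_t *config;
            ps_decoder_t *ps;

            /* Placeholder paths; -lw is the language weight being varied. */
            config = cmd_ln_init(NULL, ps_args(), TRUE,
                    "-hmm",  "model/my_8khz_model",
                    "-dict", "model/words.dic",
                    "-jsgf", "grammar/isolated.gram",
                    "-lw",   "2.1",
                    NULL);
            if (config == NULL)
                return 1;

            ps = ps_init(config);
            if (ps == NULL)
                return 1;

            /* ... decode the test utterances and score them here ... */

            ps_free(ps);
            return 0;
        }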

    The other thing I have noticed is that when I change cmn from none to
    current, the accuracy drops to 95%. And I get the same result from PS
    with and without CMN: for acoustic models built with cmn=current, the
    accuracy stays at 95% whether the decoder uses current or none.

    Should I use the PS feature extractor, and does CMN "current" work for
    live operation?
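
    (One way to see which CMN the decoder actually ends up with is to query
    its configuration after ps_init(); whether the explicit -cmn request or
    the model's feat.params wins may depend on the release, so the sketch
    below, again with placeholder paths, is only an illustration.)

        #include <stdio.h>
        #include <pocketsphinx.h>

        int main(void)
        {
            cmd_ln_t *config;
            ps_decoder_t *ps;

            /* Request CMN explicitly; placeholder paths as before. */
            config = cmd_ln_init(NULL, ps_args(), TRUE,
                    "-hmm",  "model/my_8khz_model",
                    "-dict", "model/words.dic",
                    "-jsgf", "grammar/isolated.gram",
                    "-cmn",  "current",
                    NULL);
            ps = ps_init(config);
            if (ps == NULL)
                return 1;

            /* Print the value in effect after the model's feat.params has
               been read during initialization. */
            printf("effective -cmn: %s\n",
                   cmd_ln_str_r(ps_get_config(ps), "-cmn"));

            ps_free(ps);
            return 0;
        }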

    Many thanks for any answer.

    Best regards,

    Mirko

     
  • Nickolay V. Shmyrev

    Hello

    Which version of pocketsphinx are you talking about? I believe there was
    a regression recently that could cause such a problem.

    As for CMN, are you sure you have enough data to train the model? It
    doesn't look like it.

     
  • Mirko Budric

    Mirko Budric - 2010-10-20

    I am using PS 0.6.1, the latest release.

    I am using more than 40 hours of material.

    With S4 I noticed around a 2-3% absolute improvement, but here the
    accuracy drops. What is even stranger is that I get the same accuracy
    with or without CMN enabled in pocketsphinx. Normally I would expect
    very bad accuracy when the features don't match.

    Thanks Nickolay

     
  • Nickolay V. Shmyrev

    Hello Mirko

    Sorry for the late reply. Unfortunately it's hard to say anything
    meaningful except "it shouldn't be so". Could you try an older
    pocketsphinx version, for example, to check whether it's a regression?
    If you find the change that caused it, that would be helpful. We also
    need to set up a meaningful test for JSGF decoding with pocketsphinx;
    I have also noticed that the language weight influences decoding.

    > I am using more than 40 hours of material.

    Why do you train only 1000 senones then?

    > What is even stranger is that I get the same accuracy with or without
    > CMN enabled in pocketsphinx. Normally I would expect very bad accuracy
    > when the features don't match.

    I think pocketsphinx uses the CMN setting configured in the model's
    feat.params file; it ignores your option.
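
    (For illustration: feat.params, shipped in the acoustic model directory,
    is a plain list of front-end flags, and a "-cmn" line there is what the
    decoder picks up. The values below only show the format and are not this
    particular model's settings.)

        -feat 1s_c_d_dd
        -agc none
        -cmn current
        -varnorm no
        -lowerf 200
        -upperf 3500
        -nfilt 31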

     
