
Sphinx-3 decoder max file length

2010-09-27
2012-09-22
  • Berker Batur - 2010-09-27

    Hi,
    I have to decode large .wav files (longer than 2 min.) in one step and get
    the n-best hypotheses.
    I have 3 questions about these.

    1-) Is there any negative effect of increasing S3_MAX_FRAMES in s3types.h and
    feat.h to decode large .wav files?

    2-) Is there a tutorial on how to use sphinx3_continuous?

    3-) How can I get the n-best hypotheses from large lattice files using
    sphinx3_astar? (How do I increase the sphinx3_astar limits?)

     
  • Nickolay V. Shmyrev

    1-) Is there any negative effect of increasing S3_MAX_FRAMES in s3types.h and
    feat.h to decode large .wav files?

    Yes, it requires more memory and can even run out of memory. The whole
    decoding history expands as a tree.

    For that reason a decoder architecture often includes an endpointer, a
    component that chunks the incoming continuous audio into small utterances.
    Each utterance is then decoded separately. The endpointer can either be
    embedded in the decoder or run as a standalone preprocessor, like cont_ad
    in sphinxbase. Sphinx-3 also has continuous tools such as
    sphinx3_continuous.
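    To make the endpointing idea concrete, here is a minimal, hypothetical sketch of
    energy-based chunking. This is not the cont_ad API; the frame size, threshold,
    and silence-run length are made-up illustration values. Frames whose energy stays
    below the threshold for long enough close the current utterance, and each
    resulting chunk would then be decoded separately.

```python
# Hypothetical sketch of an endpointer: split a stream of audio samples into
# "utterances" separated by runs of low-energy (silence) frames. This mimics
# the role of sphinxbase's cont_ad, but is NOT its API.

FRAME = 4            # samples per frame (tiny, for illustration only)
THRESH = 100         # energy threshold separating speech from silence
MIN_SIL_FRAMES = 2   # this many silent frames in a row ends an utterance

def frame_energy(frame):
    """Mean absolute amplitude of one frame."""
    return sum(abs(s) for s in frame) / len(frame)

def endpoint(samples):
    """Chunk samples into utterances, cutting on silence gaps."""
    utterances, current, silent = [], [], 0
    for i in range(0, len(samples) - FRAME + 1, FRAME):
        frame = samples[i:i + FRAME]
        if frame_energy(frame) >= THRESH:
            current.extend(frame)
            silent = 0
        elif current:
            silent += 1
            if silent >= MIN_SIL_FRAMES:
                utterances.append(current)
                current, silent = [], 0
    if current:
        utterances.append(current)
    return utterances
```

    A 2-minute recording with pauses would come out as several short utterances,
    each small enough to stay well under S3_MAX_FRAMES.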

    2-) Is there a tutorial on how to use sphinx3_continuous?

    No, there is no such tutorial, and we don't recommend using
    sphinx3_continuous. We recommend that our users use pocketsphinx instead.
    It provides a consistent API for all required functionality, including
    n-best lists.

    3-) How can I get the n-best hypotheses from large lattice files using
    sphinx3_astar? (How do I increase the sphinx3_astar limits?)

    Sorry, I'm not sure what limits you are talking about.

     
  • Berker Batur - 2010-09-28

    3-) How can I get the n-best hypotheses from large lattice files using
    sphinx3_astar? (How do I increase the sphinx3_astar limits?)

    Sorry, I'm not sure what limits you are talking about.

    I decoded a wav file that is 2.03 min long. Sphinx then generated the
    related lattice file, which has 15163 nodes in it.
    When I run sphinx3_astar on this lattice file to get the 10 best hypotheses,
    it gives the following error:

    INFO: dag.c(1158): Reading DAG file: ../lattice/test.lat.gz
    INFO: dag.c(1314): dag->nfrm+1, 12334
    INFO: main_astar.c(418): 12333 frames, 10876 nodes, 322494 edges, 0 bypass
    INFO: astar.c(676): Writing N-Best list to ../nbest/test.nbest.gz
    ERROR: "astar.c", line 607: Max PPATH limit (1000000) exceeded
    ERROR: "astar.c", line 710: test: A* search failed

    Should I increase this PPATH limit of sphinx3_astar, or should I try
    something else like pruning (beam or absolute)?

     
  • Nickolay V. Shmyrev

    Yes, it's the maxppath option. You can decrease the beams as well.
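    For intuition, the search that sphinx3_astar performs can be sketched as
    best-first N-best path enumeration over the lattice DAG, where a cap on the
    number of expanded partial paths plays the role of the PPATH limit in the
    error above. This is a hypothetical illustration, not the sphinx3 code: the
    lattice representation here (node mapped to a list of (successor, cost)
    edges) and the function names are assumptions.

```python
# Hypothetical N-best path enumeration over a word lattice (DAG), best-first
# by accumulated cost. Raising max_partial_paths is the analogue of raising
# the Max PPATH limit; tighter pruning shrinks the lattice so fewer partial
# paths are ever expanded.
import heapq

def nbest(lattice, start, end, n, max_partial_paths=1000000):
    """Return up to n lowest-cost start->end paths as (cost, path) pairs."""
    heap = [(0, [start])]          # (cost so far, path so far)
    results, expanded = [], 0
    while heap and len(results) < n:
        cost, path = heapq.heappop(heap)
        expanded += 1
        if expanded > max_partial_paths:
            raise RuntimeError("Max PPATH limit exceeded")
        node = path[-1]
        if node == end:
            results.append((cost, path))
            continue
        for nxt, weight in lattice.get(node, ()):
            heapq.heappush(heap, (cost + weight, path + [nxt]))
    return results
```

    On a 12333-frame lattice with hundreds of thousands of edges, the number of
    partial paths grows quickly, which is why both raising the cap and pruning
    the lattice (beam or absolute) are reasonable options.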

     
