It seems to me that in principle one can write a finite state grammar in (say)
bigram fashion: create a bigram LM, and then just edit the probabilities to
reflect the FSG logic.
For example, let the FSG be
hello (alpha | beta) how are you?
The corresponding bigram LM will have only the following probabilities non-zero
(with backoff disabled):
P(alpha | hello)
P(beta | hello)
P(how | alpha)
P(how | beta)
P(are | how)
P(you | are)
P(</s> | you)
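For concreteness, a rough sketch of that LM in ARPA format might look as
follows (the log10 values are illustrative: -0.3010 is log10(0.5) for the
alpha/beta branch, the -99 unigram probabilities and backoff weights
effectively disable backoff, and a P(hello | <s>) entry is added for the
sentence start):

\data\
ngram 1=8
ngram 2=8

\1-grams:
-99.0 <s> -99.0
-99.0 hello -99.0
-99.0 alpha -99.0
-99.0 beta -99.0
-99.0 how -99.0
-99.0 are -99.0
-99.0 you -99.0
-99.0 </s>

\2-grams:
0.0000 <s> hello
-0.3010 hello alpha
-0.3010 hello beta
0.0000 alpha how
0.0000 beta how
0.0000 how are
0.0000 are you
0.0000 you </s>

\end\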
I am not sure whether this can be generalized, though (counter-examples,
please!). If it can be, then we could use the n-gram code, which can produce
an n-best list, for FSGs (currently an unsupported feature; see
https://sourceforge.net/projects/cmusphinx/forums/forum/5471/topic/4473108).
Never mind, I found a counter-example:
<sentence> = w1 w2 w1 (w2 | w3) w2 w1 (w2 | w3 | w4) </sentence>
Here the allowed successors of w1 depend on which occurrence of w1 it is: the
second and third occurrences share the trigram context (w2, w1) but permit
different continuations, so one needs a 4-gram representation to convert this
FSG to an n-gram. The method may therefore work only for some special-case
FSGs (like the one in the previous post).
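To check this concretely, here is a small Python sketch (word labels as in the
grammar above) showing that a bigram model built from exactly the bigrams
occurring in valid sentences over-generates, accepting the string "w1 w4"
which the grammar does not produce:

from itertools import product

# All sentences generated by: w1 w2 w1 (w2|w3) w2 w1 (w2|w3|w4)
def valid_sentences():
    for x, y in product(["w2", "w3"], ["w2", "w3", "w4"]):
        yield ["<s>", "w1", "w2", "w1", x, "w2", "w1", y, "</s>"]

# Every bigram that appears in at least one valid sentence
bigrams = set()
for s in valid_sentences():
    bigrams.update(zip(s, s[1:]))

def bigram_accepts(words):
    s = ["<s>"] + words + ["</s>"]
    return all(pair in bigrams for pair in zip(s, s[1:]))

valid = {tuple(s[1:-1]) for s in valid_sentences()}
bad = ["w1", "w4"]
# Accepted by the bigram model, but not in the grammar -> over-generation
print(bigram_accepts(bad), tuple(bad) in valid)  # True False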
The easiest way to support n-best is actually to implement n-best.
Hi,
It has been some time since I looked at it, but the n-best code for FSG
already exists
(http://sourceforge.net/projects/cmusphinx/forums/forum/5471/topic/4758690).
Regards,
The code exists, but it does not work properly. It doesn't take FSG
probabilities into account, only the acoustic scores.