Hey Nickolay V. Shmyrev, thanks for your help. I just need a source for my thesis where I describe the reason why I moved from PocketSphinx to Sphinx4. Right now I am justifying it by quoting this line: "...significant reduction in memory consumption" and saying: "Because of its lack of memory, PocketSphinx cannot handle a large vocabulary as well as Sphinx4."
This is wrong. Both decoders are about the same. A reduction in memory consumption means there is more memory available, not less.
You can safely use both PocketSphinx and Sphinx4 on a server for LVCSR. For best accuracy it is better to use the more modern Kaldi toolkit, but it is harder to set up.
So why is it then recommended to run it on a server for real-time transcription with a large vocabulary, instead of running it offline on an Android smartphone?
Would PocketSphinx on Android and Sphinx4 on a computer produce the same result when used with a large vocabulary? If not, what is the reason for that?
Hiho,
I read here that it's recommended to use a server for real-time transcription with a large vocabulary. Is there an official statement regarding this?
Send me (nshmyrev@gmail.com) your postal address, and I'll send you an official confirmation on company letterhead with a stamp.
Last edit: Nickolay V. Shmyrev 2017-01-30
Android phones are too slow for real-time large-vocabulary decoding with Sphinx4. Only Google has that technology.
Ah, thank you. Is there a source for your statement?
Last edit: Peter Mueller 2017-01-30