Failed to create a new dictionary using g2p-seq2seq

  • Askarya QS - 2017-03-06

    I am trying to build a new phonetic dictionary using g2p-seq2seq, but I have some trouble training my dict.
    I built the new dict for Bahasa Indonesia (Indonesian) with just 5 words: the robot movement commands in Bahasa Indonesia.
    Example: maju, mundur, kiri, kanan, berhenti
    I built a phone set of 34 phones, defining the symbols for Bahasa Indonesia.
    After that I ran training and got this:

    Preparing G2P data
    Creating vocabulary ind_model/vocab.phoneme
    Creating vocabulary ind_model/vocab.grapheme
    Reading development and training data.
    W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE3 instructions, but these are available on your machine and could speed up CPU computations.
    W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
    W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
    W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
    W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
    W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
    Creating 2 layers of 64 units.
    Created model with fresh parameters.
    global step 200 learning rate 0.5000 step-time 0.44 perplexity 1.64
    eval: perplexity 1.00
    global step 400 learning rate 0.5000 step-time 0.36 perplexity 1.15
    eval: perplexity 1.00
    global step 600 learning rate 0.5000 step-time 0.43 perplexity 1.01
    eval: perplexity 1.00
    global step 800 learning rate 0.5000 step-time 0.41 perplexity 1.00
    eval: perplexity 1.00
    [... perplexity stays at 1.00 for every step and eval through global step 7200; the learning rate decays to 0.4950 at step 6000 ...]
    Training done.
    Traceback (most recent call last):
      File "/usr/local/bin/g2p-seq2seq", line 9, in <module>
        load_entry_point('g2p-seq2seq==5.0.0a0', 'console_scripts', 'g2p-seq2seq')()
      File "build/bdist.linux-x86_64/egg/g2p_seq2seq/app.py", line 79, in main
      File "build/bdist.linux-x86_64/egg/g2p_seq2seq/g2p.py", line 267, in train
      File "build/bdist.linux-x86_64/egg/g2p_seq2seq/g2p.py", line 76, in load_decode_model
    RuntimeError: Model not found in ind_model

    Will you help me to fix that? Thank you.
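
    (For reference, a minimal sketch of the training command assumed here, following the g2p-seq2seq README; the dictionary filename is only a placeholder for whatever file was actually used:)

        g2p-seq2seq --train ind_dictionary.dic --model ind_model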

  • Askarya QS - 2017-03-06

    That perplexity value stopped changing; it just stays at 1.00. What is the meaning of that "perplexity"?

    • Nickolay V. Shmyrev

      What is the meaning of that "perplexity"?

      https://en.wikipedia.org/wiki/Perplexity
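
      Briefly: perplexity is the exponential of the average negative log-likelihood the model assigns to the reference tokens, so a perplexity of 1.00 means the model predicts every token with probability 1. A minimal sketch in plain Python (not g2p-seq2seq internals):

          import math

          def perplexity(token_log_probs):
              # token_log_probs: log p(token) that the model assigned to each
              # reference token; perplexity is exp of the mean negative log-likelihood.
              return math.exp(-sum(token_log_probs) / len(token_log_probs))

          # A model that is certain and correct (p = 1.0 for every token) sits at
          # the floor of 1.00 -- easy to reach when the training set is only 5 words.
          print(perplexity([math.log(1.0)] * 5))  # 1.0
          print(perplexity([math.log(0.5)] * 5))  # ~2.0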

      • Askarya QS - 2017-03-06

        Could it be that the training data in the .dict file I used is too small? I only have 5 words.

        • Arseniy Gorin - 2017-03-06

          You need around 100k words to train this kind of model. For your task you do not need G2P at all; just make a hand-crafted dictionary, as in the sketch below.
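
          For the 5 commands, such a dictionary could look like this; the phone symbols are only illustrative and must be replaced with symbols from the actual 34-phone set used by the acoustic model:

              maju      m a j u
              mundur    m u n d u r
              kiri      k i r i
              kanan     k a n a n
              berhenti  b e r h e n t i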

          • Askarya QS - 2017-03-07

            Thanks for the answer. Perhaps I will build around 100k words; that will help my projects a lot.
