From: Daniel P. <dp...@gm...> - 2014-11-11 20:45:41
Probably the option --use-silphones was removed from the script but remains in the calling example. Just remove the option for now and re-run from the step that failed. Karel, you might want to check in a fix for this.

Dan

On Tue, Nov 11, 2014 at 3:40 PM, Ming Tu <tum...@gm...> wrote:
> Hi everyone,
>
> I have another error when running Karel's script. The log is:
>
> ============================================================================
> DNN Hybrid Training & Decoding (Karel's recipe)
> ============================================================================
> steps/nnet/make_fmllr_feats.sh --nj 10 --cmd run.pl --transform-dir
> exp/tri3/decode_test data-fmllr-tri3/test data/test exp/tri3
> data-fmllr-tri3/test/log data-fmllr-tri3/test/data
> steps/nnet/make_fmllr_feats.sh: feature type is lda_fmllr
> steps/nnet/make_fmllr_feats.sh: Done!, type lda_fmllr, data/test -->
> data-fmllr-tri3/test, using : raw-trans None, gmm exp/tri3, trans
> exp/tri3/decode_test
> steps/nnet/make_fmllr_feats.sh --nj 10 --cmd run.pl --transform-dir
> exp/tri3/decode_dev data-fmllr-tri3/dev data/dev exp/tri3
> data-fmllr-tri3/dev/log data-fmllr-tri3/dev/data
> steps/nnet/make_fmllr_feats.sh: feature type is lda_fmllr
> steps/nnet/make_fmllr_feats.sh: Done!, type lda_fmllr, data/dev -->
> data-fmllr-tri3/dev, using : raw-trans None, gmm exp/tri3, trans
> exp/tri3/decode_dev
> steps/nnet/make_fmllr_feats.sh --nj 10 --cmd run.pl --transform-dir
> exp/tri3_ali data-fmllr-tri3/train data/train exp/tri3
> data-fmllr-tri3/train/log data-fmllr-tri3/train/data
> steps/nnet/make_fmllr_feats.sh: feature type is lda_fmllr
> steps/nnet/make_fmllr_feats.sh: Done!, type lda_fmllr, data/train -->
> data-fmllr-tri3/train, using : raw-trans None, gmm exp/tri3, trans
> exp/tri3_ali
> utils/subset_data_dir_tr_cv.sh data-fmllr-tri3/train
> data-fmllr-tri3/train_tr90 data-fmllr-tri3/train_cv10
> /home/ming/kaldi-trunk/egs/timit/s5/utils/subset_data_dir.sh: reducing #utt
> from 3696 to 3320
> /home/ming/kaldi-trunk/egs/timit/s5/utils/subset_data_dir.sh: reducing #utt
> from 3696 to 376
> exp/dnn4_pretrain-dbn/log/rbm.3.log:progress: [6.69102 4.79589 4.59248
> 4.52796 4.49425 4.47364 4.45425 4.43291 4.41321 4.39089 4.37391 4.36182
> 4.33871 4.32305 4.30841 4.29437 4.27835 4.26338 4.24451 4.23261 4.22594
> 4.22015 ]
> exp/dnn4_pretrain-dbn/log/rbm.4.log:progress: [5.08679 3.75421 3.60121
> 3.55219 3.52727 3.51 3.49744 3.48335 3.47008 3.45369 3.44215 3.43454 3.42081
> 3.40939 3.40078 3.39252 3.38437 3.3727 3.35988 3.35331 3.35172 3.34959 ]
> exp/dnn4_pretrain-dbn/log/rbm.5.log:progress: [4.87049 3.32658 3.14669
> 3.10413 3.08638 3.0761 3.0662 3.05728 3.04658 3.03451 3.02772 3.02178
> 3.01145 3.00338 2.99626 2.99111 2.98376 2.97654 2.96586 2.96138 2.96174
> 2.95995 ]
> exp/dnn4_pretrain-dbn/log/rbm.6.log:progress: [3.65318 2.68876 2.56733
> 2.52673 2.50761 2.49697 2.49113 2.4853 2.47522 2.46409 2.45972 2.45737
> 2.44803 2.4431 2.43837 2.43415 2.42968 2.42373 2.41349 2.41221 2.41144
> 2.41134 ]
>
> Pre-training finished.
> Removing features tmpdir /tmp/tmp.XQL9AHFHQJ @ ming
> train.ark
> # Accounting: time=5612 threads=1
> # Ended (code 0) at Tue Nov 11 05:18:32 MST 2014, elapsed time 5612 seconds
> # steps/nnet/pretrain_dbn.sh --hid-dim 1024 --rbm-iter 20
> data-fmllr-tri3/train exp/dnn4_pretrain-dbn
> # Started at Tue Nov 11 11:52:50 MST 2014
> #
> steps/nnet/pretrain_dbn.sh --hid-dim 1024 --rbm-iter 20
> data-fmllr-tri3/train exp/dnn4_pretrain-dbn
> # INFO
> steps/nnet/pretrain_dbn.sh : Pre-training Deep Belief Network as a stack of
> RBMs
> dir : exp/dnn4_pretrain-dbn
> Train-set : data-fmllr-tri3/train
> steps/nnet/pretrain_dbn.sh Skipping, already have
> exp/dnn4_pretrain-dbn/6.dbn
> # Accounting: time=0 threads=1
> # Ended (code 0) at Tue Nov 11 11:52:50 MST 2014, elapsed time 0 seconds
> ITERATION 13: TRAIN AVG.LOSS 0.7196, (lrate1.5625e-05), CROSSVAL AVG.LOSS
> 1.4502, nnet accepted
> (nnet_6.dbn_dnn_iter13_learnrate1.5625e-05_tr0.7196_cv1.4502)
> finished, too small rel. improvement .0006753171
> Succeeded training the Neural Network : exp/dnn4_pretrain-dbn_dnn/final.nnet
> Preparing feature transform with CNN layers for RBM pre-training.
> steps/nnet/train.sh successfuly finished..
> exp/dnn4_pretrain-dbn_dnn
> Removing features tmpdir /tmp/tmp.rUdewHleyG @ ming
> cv.ark
> train.ark
> # Accounting: time=1015 threads=1
> # Ended (code 0) at Tue Nov 11 05:35:27 MST 2014, elapsed time 1015 seconds
> # steps/nnet/train.sh --feature-transform
> exp/dnn4_pretrain-dbn/final.feature_transform --dbn
> exp/dnn4_pretrain-dbn/6.dbn --hid-layers 0 --learn-rate 0.008
> data-fmllr-tri3/train_tr90 data-fmllr-tri3/train_cv10 data/lang exp/tri3_ali
> exp/tri3_ali exp/dnn4_pretrain-dbn_dnn
> # Started at Tue Nov 11 11:52:50 MST 2014
> #
> steps/nnet/train.sh --feature-transform
> exp/dnn4_pretrain-dbn/final.feature_transform --dbn
> exp/dnn4_pretrain-dbn/6.dbn --hid-layers 0 --learn-rate 0.008
> data-fmllr-tri3/train_tr90 data-fmllr-tri3/train_cv10 data/lang exp/tri3_ali
> exp/tri3_ali exp/dnn4_pretrain-dbn_dnn
>
> # INFO
> steps/nnet/train.sh : Training Neural Network
> dir : exp/dnn4_pretrain-dbn_dnn
> Train-set : data-fmllr-tri3/train_tr90 exp/tri3_ali
> CV-set : data-fmllr-tri3/train_cv10 exp/tri3_ali
>
> SKIPPING TRAINING...
> (steps/nnet/train.sh)
> nnet already trained : exp/dnn4_pretrain-dbn_dnn/final.nnet
> (nnet/nnet_6.dbn_dnn_iter13_learnrate1.5625e-05_tr0.7196_cv1.4502_final_)
>
> # Accounting: time=0 threads=1
> # Ended (code 0) at Tue Nov 11 11:52:50 MST 2014, elapsed time 0 seconds
> steps/nnet/decode.sh --nj 20 --cmd run.pl --acwt 0.2 exp/tri3/graph
> data-fmllr-tri3/test exp/dnn4_pretrain-dbn_dnn/decode_test
> Warning: run.pl ignoring options "-pe smp 2 "
> steps/nnet/decode.sh --nj 20 --cmd run.pl --acwt 0.2 exp/tri3/graph
> data-fmllr-tri3/dev exp/dnn4_pretrain-dbn_dnn/decode_dev
> Warning: run.pl ignoring options "-pe smp 2 "
> steps/nnet/align.sh --nj 20 --cmd run.pl data-fmllr-tri3/train data/lang
> exp/dnn4_pretrain-dbn_dnn exp/dnn4_pretrain-dbn_dnn_ali
> steps/nnet/align.sh: aligning data 'data-fmllr-tri3/train' using nnet/model
> 'exp/dnn4_pretrain-dbn_dnn', putting alignments in
> 'exp/dnn4_pretrain-dbn_dnn_ali'
> steps/nnet/align.sh: done aligning data.
> steps/nnet/make_denlats.sh --nj 20 --cmd run.pl --acwt 0.2 --lattice-beam
> 10.0 --beam 18.0 data-fmllr-tri3/train data/lang exp/dnn4_pretrain-dbn_dnn
> exp/dnn4_pretrain-dbn_dnn_denlats
> Making unigram grammar FST in exp/dnn4_pretrain-dbn_dnn_denlats/lang
> Compiling decoding graph in exp/dnn4_pretrain-dbn_dnn_denlats/dengraph
> Graph exp/dnn4_pretrain-dbn_dnn_denlats/dengraph/HCLG.fst already exists:
> skipping graph creation.
> steps/nnet/make_denlats.sh: generating denlats from data
> 'data-fmllr-tri3/train', putting lattices in
> 'exp/dnn4_pretrain-dbn_dnn_denlats'
> Warning: run.pl ignoring options "-pe smp 2 "
> steps/nnet/make_denlats.sh: done generating denominator lattices.
> steps/nnet/train_mpe.sh --cmd run.pl --num-iters 6 --acwt 0.2 --do-smbr true
> --use-silphones true data-fmllr-tri3/train data/lang
> exp/dnn4_pretrain-dbn_dnn exp/dnn4_pretrain-dbn_dnn_ali
> exp/dnn4_pretrain-dbn_dnn_denlats exp/dnn4_pretrain-dbn_dnn_smbr
> steps/nnet/train_mpe.sh: invalid option --use-silphones
>
> Then the script just exited without reporting an error. I checked
> train_mpe.sh and found no mention of the option "--use-silphones".
>
> gcc is now 4.9.1 because of Ubuntu's update; I changed host_config.h of the
> CUDA toolkit, so there is no "Unsupported gcc" error anymore. Thanks.
>
> --
> Graduate Research Assistant
> Signal Analysis, Representation and Perception Lab
> Arizona State University, USA
>
> _______________________________________________
> Kaldi-users mailing list
> Kal...@li...
> https://lists.sourceforge.net/lists/listinfo/kaldi-users
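For anyone hitting the same failure: following Dan's suggestion, the fix should just be the failing command from the log above with the stale flag dropped (a sketch, assuming all other arguments and paths from Ming's setup stay the same; run from the egs/timit/s5 recipe directory):

```
# Same invocation as in the log, minus the removed --use-silphones option.
steps/nnet/train_mpe.sh --cmd run.pl --num-iters 6 --acwt 0.2 --do-smbr true \
  data-fmllr-tri3/train data/lang \
  exp/dnn4_pretrain-dbn_dnn exp/dnn4_pretrain-dbn_dnn_ali \
  exp/dnn4_pretrain-dbn_dnn_denlats exp/dnn4_pretrain-dbn_dnn_smbr
```

Since the earlier stages print "Skipping" / "already have" when their outputs exist, re-running the top-level script should fast-forward to this step.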