From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-27 09:32:05

Hi All,

I am in the process of running the wsj/s5 recipe. Now I am about to run the DNN experiments and am specifically interested in the DNN training. I am planning to look into the DNN code for more understanding. Since there are many DNN variants, could anyone tell me which papers the Kaldi DNN implementations are based on?

Thanks,
Lahiru

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-27 17:21:31

There are basically two setups there: Karel's setup, generally called run_dnn.sh or run_nnet.sh, which is for GPUs, and my setup, called run_nnet_cpu.sh, which is for CPUs in parallel. Karel's setup may have an ICASSP paper; Karel can tell you. Mine is mostly unpublished.

Dan

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-27 21:30:03

In my setup there is RBM pre-training:
http://www.cs.toronto.edu/~hinton/absps/guideTR.pdf
followed by per-frame cross entropy training and sMBR training:
http://www.danielpovey.com/files/2013_interspeech_dnn.pdf
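
For readers who want the shape of the pre-training step before diving into the code: below is a minimal numpy sketch of the CD-1 update described in Hinton's practical guide linked above. It is an illustration of the algorithm only, not Karel's actual implementation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_vis, b_hid, v0, lr=0.01):
    """One contrastive-divergence (CD-1) step for a binary-binary RBM,
    on a minibatch v0 with one example per row."""
    # Positive phase: hidden probabilities given the data, plus a binary
    # sample to drive the reconstruction.
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0_samp = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step down to the visibles and back up.
    v1_prob = sigmoid(h0_samp @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)
    # Gradient estimate: <v h>_data - <v h>_reconstruction.
    n = v0.shape[0]
    W = W + lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / n
    b_vis = b_vis + lr * (v0 - v1_prob).mean(axis=0)
    b_hid = b_hid + lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid

Stacking the DBN then amounts to training one such RBM per layer and feeding its hidden probabilities to the next layer's RBM as "data", before the supervised cross-entropy and sMBR stages.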

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-28 06:05:10

Thanks guys :-)

Dan, is your setup for distributed training? Or does it only parallelize within a single machine?

Thanks,
Lahiru

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-28 14:28:48

It's on multiple machines and also multiple threads per machine.

Dan

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-28 15:04:06

Wow, nice. Is the implementation similar to Jeff Dean's paper "Large Scale Distributed Deep Networks" (http://www.cs.toronto.edu/~ranzato/publications/DistBeliefNIPS2012_withAppendix.pdf)? Does Kaldi use asynchronous SGD?

Please give me a brief description.

Thanks,
Lahiru

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-28 15:06:49

It's not the same as that. Each machine does SGD separately and, periodically, the parameters are averaged across machines.

Dan
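
To make the contrast with DistBelief concrete, here is a toy, single-process sketch of "train separately, average periodically". All names are hypothetical; in the real setup the workers are separate cluster jobs, and nothing below reflects the actual learning-rate or averaging schedule.

import numpy as np

rng = np.random.default_rng(0)

def sgd_pass(w, X, Y, lr=0.05):
    """One plain-SGD pass over a single worker's data shard
    (toy model: linear regression with squared error)."""
    for x, y in zip(X, Y):
        w = w - lr * (w @ x - y) * x   # per-example gradient step
    return w

def parallel_averaged_sgd(w0, shards, n_rounds=20):
    """Each 'machine' runs SGD on its own shard from the same starting
    point; after every round the parameters are averaged."""
    w = w0
    for _ in range(n_rounds):
        results = [sgd_pass(w.copy(), X, Y) for (X, Y) in shards]
        w = np.mean(results, axis=0)   # the periodic parameter average
    return w

# Demo: split synthetic data across 4 "machines".
true_w = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(400, 3))
Y = X @ true_w
shards = [(X[i::4], Y[i::4]) for i in range(4)]
print(parallel_averaged_sgd(np.zeros(3), shards))  # close to true_w

Averaging after whole passes keeps the communication tiny compared with DistBelief's asynchronous per-gradient pushes to a parameter server; the price is that workers run on slightly stale parameters between averages.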

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-07-02 13:10:56

Hi All,

When running DNN training on GPUs, I am getting the following error.

Log file: exp/tri4b_pretrain-dbn/_pretrain_dbn.log

# PRE-TRAINING RBM LAYER 1
Initializing 'exp/tri4b_pretrain-dbn/1.rbm.init'
Traceback (most recent call last):
  File "utils/nnet/gen_rbm_init.py", line 40, in ?
    dimL.append(int(dimStrL[i]))
ValueError: invalid literal for int():

I am running this on a GPU cluster which assigns the job to a GPU dynamically, so I cannot manually configure the option "gpu_id= # manually select GPU id to run on, (-1 disables GPU)".
Can this be the cause?

Thanks,
Lahiru
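
One way to read this traceback: int() raises exactly this error on an empty token, so the colon-separated dimension string handed to gen_rbm_init.py was most likely empty because an upstream command failed and printed nothing (which the next message bears out). A hypothetical parser, not the actual gen_rbm_init.py code, showing the failure mode and a louder check:

def parse_dims(dim_str):
    """Parse a colon-separated dimension string such as '440:1024:3456'."""
    tokens = dim_str.split(":")
    if not all(tok.strip().isdigit() for tok in tokens):
        # int('') is what produces "invalid literal for int()"; an empty
        # token usually means the command that was supposed to emit the
        # dimensions failed silently.
        raise ValueError("malformed dimension string: %r" % dim_str)
    return [int(tok) for tok in tokens]

print(parse_dims("440:1024:3456"))  # [440, 1024, 3456]
parse_dims("")  # raises, naming the offending string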

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-07-02 13:22:50

Sorry, I was wrong. It selects the GPU automatically.

I found the error in the exp/tri4b_pretrain-dbn/log/cmvn_glob_fwd.log file:

ERROR (nnet-forward:PdfPrior():nnet-pdf-prior.cc:26) --class-frame-counts is empty: Cannot initialize priors without the counts.
ERROR (nnet-forward:main():nnet-forward.cc:196) ERROR (nnet-forward:PdfPrior():nnet-pdf-prior.cc:26) --class-frame-counts is empty: Cannot initialize priors without the counts.

Thanks,
Lahiru

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-07-02 13:38:24

Hi Lahiru,

I already fixed this issue in the trunk; the PdfPrior is now activated only when the option --class-frame-counts is present.

Karel
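
In other words, the frame-count prior (used to turn posteriors into scaled likelihoods at decode time) has no meaning in a pre-training forward pass, so it becomes optional. Below is a toy Python stand-in for the guard; the real fix is in the C++ of nnet-pdf-prior.cc and all names here are illustrative only.

import numpy as np

class PdfPrior:
    """Toy stand-in for Kaldi's PdfPrior."""
    def __init__(self, class_frame_counts):
        counts = np.asarray(class_frame_counts, dtype=float)
        if counts.size == 0:
            raise ValueError("--class-frame-counts is empty: cannot "
                             "initialize priors without the counts.")
        self.log_priors = np.log(counts / counts.sum())

    def subtract(self, loglikes):
        return loglikes - self.log_priors

def nnet_forward(loglikes, class_frame_counts=None):
    # After the fix: the prior is built only when the option is present,
    # so forward passes that have no counts (e.g. during pre-training)
    # no longer abort.
    if class_frame_counts is not None:
        loglikes = PdfPrior(class_frame_counts).subtract(loglikes)
    return loglikes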

From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-07-02 13:51:45

Thanks.