From: Mailing list used for User Communication and Updates <kal...@li...> - 2013-06-27 21:30:03
In my setup there is RBM pre-training (http://www.cs.toronto.edu/~hinton/absps/guideTR.pdf), followed by per-frame cross-entropy training and sMBR training (http://www.danielpovey.com/files/2013_interspeech_dnn.pdf).

On 27.6.2013 13:21, Mailing list used for User Communication and Updates wrote:
> There are basically two setups there: Karel's setup, generally called
> run_dnn.sh or run_nnet.sh, which is for GPUs, and my setup, called
> run_nnet_cpu.sh, which is for CPUs in parallel. Karel's setup may
> have an ICASSP paper; Karel can tell you. Mine is mostly unpublished.
>
> Dan
>
> On Thu, Jun 27, 2013 at 5:31 AM, Mailing list used for User
> Communication and Updates <kal...@li...> wrote:
>> Hi All,
>>
>> I am in the process of running the wsj/s5 recipe. Now I am about to run
>> the DNN experiments and am specifically interested in the DNN training.
>> I am planning to look into the DNN code for more understanding. Since
>> there are many DNN variants, could anyone tell me which papers the Kaldi
>> DNN implementation is based on?
>>
>> Thanks,
>> Lahiru
>>
>> _______________________________________________
>> Kaldi-users mailing list
>> Kal...@li...
>> https://lists.sourceforge.net/lists/listinfo/kaldi-users
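Kaldi's actual recipe is C++ and shell, but the first two stages named above, RBM pre-training by contrastive divergence (CD-1, as in Hinton's practical guide) and per-frame cross-entropy over state labels, can be sketched in NumPy. This is a minimal illustration, not Kaldi's code: the function names, toy layer sizes, and learning rate are mine, and a real recipe works on spliced fMLLR features with thousands of units and senone targets.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_vis, b_hid, v0, lr=0.1):
    """One contrastive-divergence (CD-1) step for a Bernoulli-Bernoulli RBM.

    v0: (n_frames, n_visible) batch of binary feature vectors.
    Updates W, b_vis, b_hid in place and returns them.
    """
    # Positive phase: hidden activation probabilities given the data.
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0_samp = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step back to the visible layer and up again.
    v1_prob = sigmoid(h0_samp @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)
    # Gradient estimate: <v h>_data - <v h>_model, averaged over the batch.
    n = v0.shape[0]
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / n
    b_vis += lr * (v0 - v1_prob).mean(axis=0)
    b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid

def frame_xent(logits, targets):
    """Per-frame cross-entropy: mean negative log-probability of the
    correct state label for each frame (logits: (n_frames, n_states))."""
    m = logits.max(axis=1, keepdims=True)  # subtract max for stability
    log_z = m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True))
    log_probs = logits - log_z
    return -log_probs[np.arange(len(targets)), targets].mean()

# Tiny usage example on random binary "frames":
v0 = (rng.random((8, 6)) < 0.5).astype(float)
W, b_vis, b_hid = np.zeros((6, 4)), np.zeros(6), np.zeros(4)
for _ in range(10):
    cd1_update(W, b_vis, b_hid, v0)
```

The third stage, sMBR sequence training, is omitted here because it needs word lattices from a full decoding pass; see the Interspeech 2013 paper linked above for its objective.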