From: Tony R. <to...@ca...> - 2015-02-28 19:48:29
Just to check that it's RAM you're running out of, not local disk. IIRC there was a change made to use /tmp quite a few months ago. Our large local disk isn't on /tmp, and once we fixed this it all worked great again.

It's good to see the tedlium recipe being at the core of development. There are so many bright people out there; a completely free and state-of-the-art recipe can only get more people working on ASR, so we'll make faster progress.

Tony

Sent from my iPad

> On 28 Feb 2015, at 19:31, "Raymond W. M. Ng" <wm...@sh...> wrote:
>
> Hi Kaldi,
>
> I am training a DNN with the Karel setup on a 160-hour data set.
> When I get to the sMBR sequence-discriminative training (steps/nnet/train_mpe.sh), the memory usage explodes. The program only manages to process around 2/7 of the training files before it crashes.
>
> There's no easy accumulation function for the DNN, but I assume I can just put different training-file splits in consecutive iterations?
>
> I'd like to know if there's a resource out there already. I was referring to the egs/tedlium recipe.
>
> thanks
> raymond
>
> _______________________________________________
> Kaldi-developers mailing list
> Kal...@li...
> https://lists.sourceforge.net/lists/listinfo/kaldi-developers
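Tony's suggestion (check whether /tmp, not RAM, is being exhausted) can be sketched as a few diagnostic commands. This is a minimal, hedged sketch assuming a Linux host; the `/data/local/tmp` path is purely illustrative, not part of any Kaldi setup:

```shell
# Check free space on the partition holding /tmp -- if the training job
# writes its temporary files here, a small /tmp can crash the run even
# when plenty of RAM remains.
df -h /tmp

# Check available RAM (and swap) to rule out genuine memory exhaustion.
free -g

# If /tmp is the bottleneck, point temporary files at the large local
# disk instead. The path below is an example only:
# export TMPDIR=/data/local/tmp
```

Comparing the `df` figure against the size of the data the job writes (rather than against RAM usage) is what distinguishes the two failure modes Tony describes.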