From: Kirill K. <kir...@sm...> - 2015-06-27 00:57:55
> From: Mate Andre [mailto:ele...@gm...]
> Sent: 2015-06-25 0809
> To: kal...@li...
>
> 1. How long does it take to train on all (960hrs) of Librispeech on a
> GPU (say GTX TITAN X or K6000)? Even a rough estimate could be useful.

Changing my estimate: I am training a model on the full 960-hour set now, and just passed 980 iterations out of 7040 total in 24 hours on a single GTX980 GPU. Since the Titan X may be 10 to 30% faster, depending on the task, you are looking at about 6 days of computation.

If you do not need to squeeze every last percent of performance out of the very first run, train on 460 hours instead; it's twice as fast. Leave the rest of the data churning through while you work on the rest of your problem. Training takes very little in the way of resources outside of the GPU and one CPU core.

I initially played with different transformations and augmentations of the raw data on a 100-hour set, to understand where it was all going. That set can be trained in a day. YMMV.

 -kkm
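P.S. For the curious, the back-of-the-envelope math behind the "about 6 days" figure can be sketched as follows. The 10-30% Titan X speedup is a rough guess on my part, not a benchmark:

```python
# Extrapolate total training time from the numbers in the post:
# 980 of 7040 iterations completed in 24 hours on a single GTX980.
iters_done = 980
iters_total = 7040
hours_elapsed = 24.0

hours_per_iter = hours_elapsed / iters_done
gtx980_days = iters_total * hours_per_iter / 24.0  # about 7.2 days

# Assume (roughly) that a Titan X is 10-30% faster than a GTX980.
titanx_days = [gtx980_days / speedup for speedup in (1.3, 1.1)]
print(f"GTX980:  ~{gtx980_days:.1f} days")
print(f"Titan X: ~{titanx_days[0]:.1f} to ~{titanx_days[1]:.1f} days")
```

which lands squarely in the 5.5 to 6.5 day range, hence "about 6 days."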