Hi All,
Can anyone share your experience with the online natural-gradient configuration options below for a ReLU network?
alpha=
rank-in=
rank-out=
num-samples-history=
update-period=
max-change-per-sample=
thanks, Yan
I haven't looked into tuning that particular setup; you could just use the defaults. It's possible that rank-in might benefit from being increased a little from the default of 30: you could try 80. It may not make much difference, though.
Dan
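In nnet2 these options are set directly on the component line of the network config. A minimal sketch assuming the AffineComponentPreconditionedOnline component, with rank-in=80 per the suggestion above (the dimensions, learning rate, stddevs and the remaining values are illustrative assumptions, not tuned recommendations; this is a single config line, wrapped here for readability):

    AffineComponentPreconditionedOnline input-dim=1024 output-dim=1024
        learning-rate=0.001 param-stddev=0.04 bias-stddev=0
        alpha=4.0 num-samples-history=2000 update-period=4
        rank-in=80 rank-out=80 max-change-per-sample=0.075

Roughly: alpha smooths the estimated Fisher matrix toward the identity; rank-in and rank-out set the rank of the low-rank Fisher approximation on the input and output sides; num-samples-history is the effective number of samples over which the Fisher statistics are averaged; update-period controls how often the low-rank estimate is refreshed; and max-change-per-sample caps the parameter change that any single sample can cause.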