Re: [Dclib-devel] [dclib-devel] question about batch normalisation
From: Eloi Du B. <elo...@gm...> - 2017-12-03 01:32:50
Ok, sorry. Affine for bn_con and multiply for dropout is what I was looking for. Thanks.

2017-12-02 19:29 GMT-06:00 Eloi Du Bois <elo...@gm...>:

> Mmmh, I see, thanks.
>
> Is there a copy helper function for copying between two nets that have
> different architectures? Like bn_con on the training net and affine on
> the production net, or dropout layers on the training net and those
> removed on the production net? I guess one solution is to access the
> right layers directly and copy them one by one, but that is a bit of a
> pain.
>
> Thanks
>
> 2017-12-02 19:24 GMT-06:00 Davis King <dav...@gm...>:
>
>> It's about making the network do what you want. You have to think about
>> what you want to do. Do you want to do batch normalization all the time?
>> Maybe you do, but in most cases that isn't what you want when you are
>> really using a model, because it does something weird to the data that
>> only really makes sense during training.
>>
>> _______________________________________________
>> Dclib-devel mailing list
>> Dcl...@li...
>> https://lists.sourceforge.net/lists/listinfo/dclib-devel
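[For later readers: in dlib this conversion does not require copying layers one by one. A network type that uses affine in place of bn_con and multiply in place of dropout can be constructed directly from the trained network, as dlib's DNN examples demonstrate. A minimal sketch, with an illustrative toy architecture (the actual layer sizes are not from this thread):]

```cpp
#include <dlib/dnn.h>
using namespace dlib;

// Training network: batch normalization and dropout active.
using train_net_type = loss_multiclass_log<
    fc<10,
    dropout<relu<bn_con<con<32,5,5,2,2,
    input<matrix<unsigned char>>
    >>>>>>;

// Deployment network: affine replaces bn_con (it absorbs the learned
// batch-norm statistics into a fixed scale and shift), and multiply
// replaces dropout (a fixed scaling instead of random zeroing).
using infer_net_type = loss_multiclass_log<
    fc<10,
    multiply<relu<affine<con<32,5,5,2,2,
    input<matrix<unsigned char>>
    >>>>>>;

int main()
{
    train_net_type tnet;
    // ... train tnet with a dnn_trainer here ...

    // Layer-by-layer conversion: each affine is constructed from the
    // corresponding bn_con, each multiply from the corresponding dropout.
    infer_net_type inet = tnet;
}
```

The assignment works because dlib's affine layer is constructible from a bn_ layer, and multiply from dropout, so no manual per-layer copying is needed.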