I am building a DNN consisting of two parallel networks. The input should be split into two parts and fed into the two networks respectively. Could anyone tell me how I can split the input feature matrix into two parts? For example, the input dimension is 100, and I want the first 30 dimensions fed into the first network and the remaining 70 dimensions fed into the second one.
Also, I need to concatenate the outputs from these two networks and feed the result into another network. How can I concatenate the output feature matrices?
Thank you very much!
This cannot be done, at least not very easily, in the nnet1 or nnet2
code, but will be possible in the nnet3 code, which should be ready in
a couple of months.
Dan
question about nnet-forward for two parallel networks
Hi,
this can be done easily with nnet1; there is a component
'<ParallelComponent>' implemented in
'trunk/src/nnet/nnet-parallel-component.h'.
It is a "container" component which holds a vector of neural
networks, while the input/output of the component is a concatenation
of the inputs/outputs of the individual neural networks.
The '<ParallelComponent>' is used, for example, in the CNN recipe
egs/rm/s5/local/nnet/run_cnn.sh,
specifically inside the script that generates the CNN prototype,
'utils/nnet/make_cnn_proto.py'
(the pitch features there are processed by a fully connected input
network without convolution).
Best,
Karel.
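As a numerical aside for readers: what '<ParallelComponent>' does can be sketched outside of Kaldi. Using the dimensions from the original question (a 100-dim input split 30/70, outputs concatenated), with two hypothetical random linear layers standing in for the nested networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "networks": single random linear layers (hypothetical weights,
# not real Kaldi nets).
W1 = rng.standard_normal((30, 16))   # first net:  30-dim in, 16-dim out
W2 = rng.standard_normal((70, 32))   # second net: 70-dim in, 32-dim out

def parallel_forward(feats):
    """Mimic <ParallelComponent>: split the feature columns, forward each
    part through its own network, concatenate the outputs."""
    part1, part2 = feats[:, :30], feats[:, 30:]   # split 100 -> 30 + 70
    out1 = part1 @ W1                             # (frames, 16)
    out2 = part2 @ W2                             # (frames, 32)
    return np.concatenate([out1, out2], axis=1)   # (frames, 48)

feats = rng.standard_normal((5, 100))             # 5 frames, 100 dims
out = parallel_forward(feats)
print(out.shape)                                  # (5, 48)
```

The caller never splits anything by hand; the component owns the split and the concatenation, which is why Karel's later advice is that the input features can be passed in whole.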
You don't need to split the input features; '<ParallelComponent>' will do
it for you.
K.
I am using pdnn to train the networks and have to follow that workflow, so what I have is three files containing the weights of the three networks. I have checked the component you pointed me to. If I'm not mistaken, I can use "nnet-concat" to concatenate the parallel networks, right?
Thank you very much!
Hello,
no; in order to put two NNs in parallel you'll need to create a
"prototype" file like this (nnet1 and nnet2 are already-trained networks):
<ParallelComponent> <InputDim> 1419 <OutputDim> 2248
<NestedNnetFilename> nnet1 nnet2 </NestedNnetFilename>
and then call 'nnet-initialize' to build the network.
Alternatively, you can do the same with random initialization
(proto1 and proto2 are NN prototypes):
<ParallelComponent> <InputDim> 1419 <OutputDim> 2248 <NestedNnetProto>
proto1 proto2 </NestedNnetProto>
'nnet-concat' is there to do the serial concatenation of components,
so it will concatenate the parallel part with the fully connected part
(the layers close to the output).
Best regards,
Karel.
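Putting Karel's recipe together as a rough sketch (nnet1 and nnet2, plus a trunk.nnet for the serial part, are placeholder filenames; the actual Kaldi commands are left as comments since they need trained nets present):

```python
# Write the prototype for the parallel part. The nnet1 prototype format
# expects one component per line, so the whole description is one line.
proto = ('<ParallelComponent> <InputDim> 1419 <OutputDim> 2248 '
         '<NestedNnetFilename> nnet1 nnet2 </NestedNnetFilename>\n')

with open('parallel.proto', 'w') as f:
    f.write(proto)

# The build itself would then be run in the shell, with nnet1/nnet2 present:
#   nnet-initialize parallel.proto parallel.nnet
# and the fully connected "trunk" spliced in series after it:
#   nnet-concat parallel.nnet trunk.nnet final.nnet

with open('parallel.proto') as f:
    print(len(f.readlines()))   # 1 -- the component sits on a single line
```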
These days I have been trying the method you told me, but it gives me errors. Could you tell me why it does not work?
Here is the error output:
nnet-initialize --binary=false nnet_config nnet_out
VLOG[1] (nnet-initialize:Init():nnet-nnet.cc:373) <ParallelComponent> <InputDim> 420 <OutputDim> 1800
ERROR (nnet-initialize:ReadToken():io-funcs.cc:155) ReadToken, failed to read token at file position -1
ERROR (nnet-initialize:ReadToken():io-funcs.cc:155) ReadToken, failed to read token at file position -1
The content of my nnet_config file is like this:
<ParallelComponent> <InputDim> 420 <OutputDim> 1800
<NestedNnetFilename> dnn1.nnet dnn2.nnet </NestedNnetFilename>
The dnn1.nnet and dnn2.nnet files are trained network files following the regular Kaldi format. They have a softmax, and the activation function is <maxout>. (We modified the Kaldi source code a little to make it work with the <maxout> unit.)
You can check the details in the files in the link (they are too large to attach here):
https://drive.google.com/folderview?id=0B8tSafcov_e8flVicGk3QklxeWtCRDdyR0R6dlZwN3ptLWstQWsyeG9SWl9sV2YtTkFpVHM&usp=sharing
Thank you.
You should at least run in gdb (gdb --args [program] [args]), do
"catch throw" and "run", and get a backtrace from where the ERROR is
printed.
Dan
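Dan's recipe, spelled out as a gdb batch script (the binary name and arguments are taken from the failing command reported above; whether a useful backtrace appears depends on how Kaldi was compiled):

```python
# Write the gdb commands Dan describes to a script file. Running it would
# then be done from the shell against the failing binary:
#   gdb --batch -x gdb.cmds --args nnet-initialize --binary=false nnet_config nnet_out
gdb_cmds = (
    "catch throw\n"   # stop where the C++ exception behind the ERROR is thrown
    "run\n"           # run the program until the catchpoint fires
    "backtrace\n"     # print the call stack at the throw site
)
with open('gdb.cmds', 'w') as f:
    f.write(gdb_cmds)
```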
Hi,
you need to put the whole <ParallelComponent> description on a single line
in the config file (the NN prototype format is one component per line):
<ParallelComponent> <InputDim> 420 <OutputDim> 1800 <NestedNnetFilename> dnn1.nnet dnn2.nnet </NestedNnetFilename>
Best!
Karel.
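For illustration, a small sketch of flattening a config whose component description got wrapped onto two lines, as in the failing nnet_config above (the broken file is created here so the example is self-contained):

```python
# Recreate the broken, two-line version of the component description:
broken = ('<ParallelComponent> <InputDim> 420 <OutputDim> 1800\n'
          '<NestedNnetFilename> dnn1.nnet dnn2.nnet </NestedNnetFilename>\n')
with open('nnet_config', 'w') as f:
    f.write(broken)

# Join all lines into one, since the prototype format requires the
# whole component on a single line:
with open('nnet_config') as f:
    fixed = ' '.join(line.strip() for line in f) + '\n'
with open('nnet_config', 'w') as f:
    f.write(fixed)

print(open('nnet_config').read(), end='')
```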
Thank you. It works!
Hi, Karel,
Sorry to disturb you again. As you mentioned above, nnet-concat does the serial concatenation of two networks. So if I want a network that has two branches and one trunk, I can use nnet-initialize to combine the two branches in parallel and then use nnet-concat to combine the branch and trunk networks into one file. Is that right?
Thank you very much!
Hi, yes, it should work like that.
K.
Thank you. Sorry to bother again.
Previously, I used nnet-forward during the decoding step; here is the command:
finalfeats="$feats nnet-forward --class-frame-counts=$dir/class.counts --apply-log=true --no-softmax=false $srcdir/dnn.nnet ark:- ark:- |"
I checked egs/rm/s5/steps/decode.sh. It seems I can still use nnet-forward to do the forward computation in the decoding step (assuming dnn.nnet is obtained from nnet-initialize and nnet-concat). However, when I use nnet-forward, it reports errors saying that there is something wrong with kaldi-io. I guess this is because I use an architecture other than the normal ones. Could you tell me how I can fix this problem, or what code I should modify?
Thank you.
Last edit: tfpeach 2015-07-07