Hello everyone!
How can I use my GPU card for decoding or generating lattices in the nnet2 setup?
Thanks in advance.
That is not supported in the binaries currently because it probably
won't give you very much speedup: only about half the time is taken in
matrix operations, so the potential speedup is less than 50%, possibly
much less. There are other threads on this topic; search on Google.
Dan
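For context, the bound mentioned above follows from Amdahl's law: if only a fraction p of the runtime is in matrix operations that the GPU could accelerate, the overall speedup is capped at 1/(1-p). A minimal sketch of that arithmetic in Python (the 0.5 fraction is the figure from the reply above; the per-component speedup values are illustrative assumptions):

# Amdahl's law: overall speedup when only a fraction p of the
# runtime benefits from acceleration by a factor s.
def amdahl_speedup(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Per the reply above, roughly half the decoding time is in matrix ops.
p = 0.5

# Even with an infinitely fast GPU (s -> inf), the ceiling is
# 1/(1-p) = 2x, i.e. at most a 50% reduction in total runtime.
# Finite per-component speedups do worse:
for s in (2, 5, 10, float("inf")):
    print(f"matrix ops {s}x faster -> overall {amdahl_speedup(p, s):.2f}x")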
On Tue, Jun 9, 2015 at 4:53 AM, JTDamaja jtdamaja@users.sf.net wrote:
Thank you for the reply.