From: Brett S. <bs...@uo...> - 2012-08-09 14:24:19

Hello,

In my network simulation I have an OpenSVC decoder, which works great. However, if I have multiple instances of the class, each of which calls the OpenSVC library (each running in its own thread, receiving its own H.264 stream), things start to go wrong.

If I disable all but one of the decoders, so that the packets reaching it are identical to the single-decoder case, it works every time. But whenever there are multiple instances of the decoder, after the first one finishes, the others produce corrupt results, and occasionally one of the decoders produces corrupt results right from the start.

I'm linking against OpenSVCDec statically, and each decoder calls SVCDecoder_init with its own _playerstruct variable. I've tried omitting the call to SVCDecoder_close, but that has no effect. I have verified that the binary input to each decoder is identical in both cases, yet the output differs; the only variable I can see is whether multiple instances exist. It all seemed to work fine with multiple instances when I was testing with AVC streams, but once I introduced SVC streams the problems started.

Is this something that should work, or should not work? Thank you!