IVMRDeinterlaceControl9 reports a deinterlace mode with the PixelAdaptive technology flag. I'm using SetDeinterlaceMode(0, pixelAdaptMode) to set this mode on the stream with index 0, and it returns 0 (S_OK). When checking the value with GetDeinterlaceMode, the mode has been set correctly. But GetActualDeinterlaceMode returns Guid.Empty. Why isn't the mode I set actually being used?
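For reference, this is roughly how I query and apply the mode (a hedged sketch against DirectShowLib; `deinterlaceControl` is the filter's IVMRDeinterlaceControl9 interface and `videoDesc` is a VMR9VideoDesc I filled in to describe the interlaced format — exact marshaling may differ between DirectShowLib versions):

```csharp
// First call with a null array to get the count, second call to fetch the GUIDs.
int count = 0;
int hr = deinterlaceControl.GetNumberOfDeinterlaceModes(ref videoDesc, ref count, null);
DsError.ThrowExceptionForHR(hr);

Guid[] modes = new Guid[count];
hr = deinterlaceControl.GetNumberOfDeinterlaceModes(ref videoDesc, ref count, modes);
DsError.ThrowExceptionForHR(hr);

// Apply one of the reported modes to stream 0.
hr = deinterlaceControl.SetDeinterlaceMode(0, modes[0]);
DsError.ThrowExceptionForHR(hr);

// Compare what was requested with what is actually in use.
Guid requested, actual;
deinterlaceControl.GetDeinterlaceMode(0, out requested);
deinterlaceControl.GetActualDeinterlaceMode(0, out actual); // Guid.Empty = not deinterlacing
```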
I read this on MSDN: "GetActualDeinterlaceMode returns GUID_NULL if the VMR has not initialized the deinterlacing hardware, or if the VMR determines that this stream should not be deinterlaced."
If the stream is clearly interlaced and the VMR still "determines that this stream should not be deinterlaced", can I force it to deinterlace anyway?
"The SetDeinterlaceMode method is effective only for new connections made to the VMR."
I'm almost sure that you must configure the deinterlace mode before connecting the VMR9 pins.
The order of my code is like this:
I have tried changing the order in all kinds of ways, but the result is always the same (no deinterlacing).
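The MSDN note quoted above implies this ordering: configure the VMR9 first, connect its input pin last. A hedged sketch of that sequence (`vmr9`, `graphBuilder`, `captureOut`, and `pixelAdaptMode` are assumed to exist already; this is not the poster's actual code):

```csharp
// Add the VMR9 to the graph and configure it before any pin connection.
graphBuilder.AddFilter(vmr9, "VMR9");

var dc = (IVMRDeinterlaceControl9)vmr9;
int hr = dc.SetDeinterlaceMode(0, pixelAdaptMode); // must happen before connecting
DsError.ThrowExceptionForHR(hr);

// Only now connect the upstream output to the VMR9 input pin,
// so the new connection picks up the configured mode.
IPin vmrIn = DsFindPin.ByDirection(vmr9, PinDirection.Input, 0);
hr = graphBuilder.Connect(captureOut, vmrIn);
DsError.ThrowExceptionForHR(hr);
```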
You might want to try a deinterlace filter instead.
What is the value of the formatType member of the media type on your VMR9 input pin?
It should be FormatType.VideoInfo2. This is the only format type that supports interlaced video.
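A hedged sketch of how to check this with DirectShowLib, once the graph is connected (`vmr9` is assumed to be the VMR9 filter instance):

```csharp
// Inspect the negotiated media type on the VMR9 input pin.
IPin vmrIn = DsFindPin.ByDirection(vmr9, PinDirection.Input, 0);
AMMediaType mt = new AMMediaType();
int hr = vmrIn.ConnectionMediaType(mt);
DsError.ThrowExceptionForHR(hr);
try
{
    // VIDEOINFOHEADER2 (FormatType.VideoInfo2) carries dwInterlaceFlags;
    // plain VIDEOINFOHEADER (FormatType.VideoInfo) cannot describe interlacing.
    bool canCarryInterlacing = (mt.formatType == FormatType.VideoInfo2);
    Console.WriteLine("formatType = {0} (VideoInfo2: {1})", mt.formatType, canCarryInterlacing);
}
finally
{
    DsUtils.FreeAMMediaType(mt);
}
```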
But does that really mean that deinterlacing can't be activated? The stream is still interlaced, even if I can't get information about exactly what kind of interlacing is used...
But what exactly is limiting me here? Is it my graphics card? Is it the capture device? Is it the source video signal?
FormatType.VideoInfo only handles non-interlaced video, which explains why the VMR9 doesn't deinterlace anything.
Follow your graph upstream to see which filter (if any) converts a FormatType.VideoInfo2 media type into a FormatType.VideoInfo media type.
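One way to follow the graph upstream is to start at the VMR9 input pin and repeatedly hop to the connected output pin, printing each connection's formatType until you find where VideoInfo2 turns into VideoInfo. A hedged sketch assuming a simple single-input chain (error handling mostly omitted):

```csharp
// Walk upstream from the VMR9, logging each connection's formatType.
IPin pin = DsFindPin.ByDirection(vmr9, PinDirection.Input, 0);
while (pin != null)
{
    IPin upstreamOut;
    if (pin.ConnectedTo(out upstreamOut) != 0 || upstreamOut == null)
        break; // reached an unconnected pin (e.g. the source filter)

    AMMediaType mt = new AMMediaType();
    upstreamOut.ConnectionMediaType(mt);

    PinInfo pinInfo;
    upstreamOut.QueryPinInfo(out pinInfo);
    FilterInfo filterInfo;
    pinInfo.filter.QueryFilterInfo(out filterInfo);

    Console.WriteLine("{0} outputs formatType {1}", filterInfo.achName, mt.formatType);
    DsUtils.FreeAMMediaType(mt);

    // Continue from that filter's input pin (assumes one input per filter).
    pin = DsFindPin.ByDirection(pinInfo.filter, PinDirection.Input, 0);
}
```

The filter whose input is VideoInfo2 but whose output is VideoInfo is the one discarding the interlacing information.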