Re: [Opalvoip-devel] Mixing streams using DirectShow
From: Robert J. <ro...@vo...> - 2012-02-22 01:33:27
Hello Alex,

Good to see you had it working. Here are my comments:

0) If that queue is filling up, then that means it is not being emptied fast enough. I cannot imagine why.

1) Setting "Digital, 64" does not set it to 64 kbps, but to 64 * 64 kbps, or 4 Mbps. Also, you do not need to alter the library for this; you can set that parameter via the "Q931-Bearer-Caps" string option.

2) This is tricky. You cannot change the code in the way you have: the extendedVideoCapability is what indicates an H.239 capability, and you cannot just make it use the non-H.239 one. That will almost certainly break H.239. Can you try adding the following to H323Connection::GetMediaFormats():

  #if OPAL_H239
    if (GetMediaStream(OpalMediaType::Video(), true) != NULL)
      list += GetRemoteH239Formats();
  #endif

3) Unfortunately we cannot just change the CustomMaxFS as you describe, as it would break SIP. I have checked in a change splitting the SIP/SDP and H.323/H.241 versions of these parameters to accommodate the different multipliers.

4) The asynchPacing.Delay(10) should only be triggered if both media streams are asynchronous. If you are talking about the RTP-to-mixer video direction, then the OpalRTPMediaStream side should be synchronous and the delay should not be triggered. I am mystified as to why it does for you.

5) Lip sync is actually quite a difficult problem, and is unlikely to be fixed any time soon.

6) Of course DirectShow is possible, but again, not simple. I did try a couple of years ago, and the very poor documentation for DirectShow, unless you are doing one of their expected scenarios, meant I had to abandon the work as it was not going to be a quick win.

BTW, what version of OPAL were you using?

Robert Jongbloed
OPAL/OpenH323/PTLib Architect and Co-founder.

From: alex parpauta [mailto:ale...@ho...]
Sent: Wednesday, 22 February 2012 2:33 AM
To: opa...@li...
Subject: [Opalvoip-devel] Mixing streams using DirectShow

Hi.
Let me start by saying that I'm new to Opal and I've only tinkered with it for a few weeks. I'm very impressed by the design and overall quality of the library.

My goal is to use Opal to set up a 4-point audio/video conference environment, with an Opal-based MCU acting as a server interconnecting the clients. I've successfully used the opalmcu sample to set up a conference and add two members (two Aver systems supporting HD720p). After setting the frame rate (30) and size (HD720) in the MyMixerNodeInfo mixer it seems to be running: I have video and audio.

I had a few problems:

- The opalmcu sample crashed with an OOM exception due to too many frames being pushed in OpalVideoMixer::VideoStream::QueuePacket; I've added the following code as a temporary workaround:

  void OpalVideoMixer::VideoStream::QueuePacket(const RTP_DataFrame & rtp)
  {
    if (m_queue.size() > MAX_SIZE)
      m_queue.pop();
    m_queue.push(rtp);
  }

- To make the remote equipment send HD, I had to:

  1) set the Q.931 bearer capabilities to "Digital, 64" in H323Connection::SendSignalSetup, so that an InformationTransferRate of 64 kbit/s with a rate multiplier of 64 is sent to the remote equipment during the H.225 setup call.

  2) deal with capability selection: the remote equipment sent two distinct H.264 video capabilities in the TerminalCapabilitySet exchange, one generic and one extended. The generic supported HD maximum, while the extended supported 4SIF maximum. The extended capability was the one selected by Opal.
Upon reviewing the code I found that the extended capability selection was enforced by this code in H323Connection::GetRemoteH239Formats:

  if (capability.GetMainType() == H323Capability::e_Video &&
      capability.GetSubType() == H245_VideoCapability::e_extendedVideoCapability)
    formats += capability.GetMediaFormat();

so I changed it to select the generic one:

  if (capability.GetMainType() == H323Capability::e_Video &&
      capability.GetSubType() == H245_VideoCapability::e_genericVideoCapability)
    formats += capability.GetMediaFormat();

- To make Opal send HD, I had to slightly modify the code in h264-x264.cxx, in MyPluginMediaFormat::ToNormalised, where the CustomMaxFS and CustomMaxMBPS values are checked against the maximum levels defined in the levels table. However, the check was done without first multiplying by the appropriate unit size: 256 for CustomMaxFS and 500 for CustomMaxMBPS. So they were in fact always set to the level's maximum, not to the intended value. For example, the Aver equipment sends Level = 2.2, CustomMaxFS = 16 and CustomMaxMBPS = 216. The code in ToNormalised took the maximum of Levels[2.2].MaxFS and 16, while it should have compared Levels[2.2].MaxFS with 16 * 256. At least this is my understanding of the H.264 meaning of CustomMaxFS and CustomMaxMBPS.
The old code was:

  unsigned maxFrameSizeInMB = std::max(LevelInfo[levelIndex].m_MaxFrameSize,
                                       String2Unsigned(original[MaxFS.m_name]));
  ClampSizes(LevelInfo[levelIndex],
             String2Unsigned(original[PLUGINCODEC_OPTION_MAX_RX_FRAME_WIDTH]),
             String2Unsigned(original[PLUGINCODEC_OPTION_MAX_RX_FRAME_HEIGHT]),
             maxFrameSizeInMB,
             original, changed);

  // Frame rate
  unsigned maxMBPS = std::max(LevelInfo[levelIndex].m_MaxMBPS,
                              String2Unsigned(original[MaxMBPS.m_name]));
  ClampMin(GetMacroBlocks(String2Unsigned(original[PLUGINCODEC_OPTION_MIN_RX_FRAME_WIDTH]),
                          String2Unsigned(original[PLUGINCODEC_OPTION_MIN_RX_FRAME_HEIGHT])) * MyClockRate / maxMBPS,
           original, changed, PLUGINCODEC_OPTION_FRAME_TIME);

which I changed to:

  unsigned maxFrameSizeInMB = std::max(LevelInfo[levelIndex].m_MaxFrameSize,
                                       String2Unsigned(original[MaxFS.m_name]) * 256);
  ClampSizes(LevelInfo[levelIndex],
             String2Unsigned(original[PLUGINCODEC_OPTION_MAX_RX_FRAME_WIDTH]),
             String2Unsigned(original[PLUGINCODEC_OPTION_MAX_RX_FRAME_HEIGHT]),
             maxFrameSizeInMB,
             original, changed);

  // Frame rate
  unsigned maxMBPS = std::max(LevelInfo[levelIndex].m_MaxMBPS,
                              String2Unsigned(original[MaxMBPS.m_name]) * 500);
  ClampMin(GetMacroBlocks(String2Unsigned(original[PLUGINCODEC_OPTION_MIN_RX_FRAME_WIDTH]),
                          String2Unsigned(original[PLUGINCODEC_OPTION_MIN_RX_FRAME_HEIGHT])) * MyClockRate / maxMBPS,
           original, changed, PLUGINCODEC_OPTION_FRAME_TIME);

- In OpalMediaPatch::Main there is a delay of 10 ms, which made the video freeze/stagger at HD on my machine (i7, 4 GB RAM). Because of the delay, the packets from the RTP video stream were not read fast enough, so after a while they were overwritten. After commenting out the delay, the video stream played smoothly.

I'm still assessing the library, so these are merely observations; it's possible I've not understood it as well as I should have. If it's a bug I'll submit a patch for it, but I'm asking to see if I'm not missing anything.
I'm aware that having an HD conference with so few modifications is an amazing feat by the whole Opal dev team. The issue I have now is that the audio and video streams are not synchronized. I saw in the current ToDo list that audio-video sync is possible but not a focus at the moment.

I am wondering whether it would be possible to use DirectShow to do the audio/video mixing, as it has synchronization mechanisms already defined, and there are already filters for mixing/transforming video streams. Also, some graphics card manufacturers provide DirectShow filters able to decode/encode H.264 in hardware. If anyone has tried something like that before, please let me know if it worked and perhaps how it can be done.

Thanks for your time.
Alex Parpauta