Asking for help...

Help
2007-10-05
2013-06-03
  • Carlos Hernandez S.

    I know this is an open source project... but is there anyone who would like to support it?

    I mean, I've asked for help many times before... but nobody replies....

    thanks

     
    • Marcos F. J.

      Marcos F. J. - 2007-10-10

      Hello Carlos, you are not alone, not any more! Welcome to the growing community of "we are lost in this project" ;) hehe

      Indeed, I feel the same. Not much about this project gets published; many people prefer to keep their experience to themselves (whether by personal choice, because they are shy, or because of internal rules at work).

      I'm trying to shake this forum up a little. If you have any questions, or want to share your experience with this project, feel free to do so (...please!)

       
    • Carlos Hernandez S.

      I don't know what is going on here... this project is very interesting... but almost nobody says "I can help you"... or says anything at all...

      Your post encouraged me, so I will post what I want to do.

      In voice and video over IP, only endpoints with signaling capabilities can interact with each other. Now, I want to make a "hidden" endpoint that can hear and see the audio and video, and then send both to a streaming server.

      The setup would be like this:

      (H.323 || SIP) endpoint -> mp4live (encoding and "hinting" audio and video for MPEG-4) -> Darwin Streaming Server (maybe the Catra Streaming Server, with bandwidth adjustment) -> web page with an embedded player (MPEG-4 and RTSP compatible).

      The part from mp4live to the client player is already done. The work is in OPAL and PWLib, to dump the audio and video.

      So the idea is to use simpleOpal to do this:

      - simpleOpal makes a call to another endpoint (as I said, SIP or H.323)

      - while the remote video and audio are being received, send them to a file... well, really it would be a named pipe carrying raw video (YUV420, maybe) and raw audio (16-bit PCM, maybe); see the sketch after this list.

      - Using this named pipe, another application can read the data; for that I want to use the "V4L2 virtual device" project. It creates a V4L2 virtual device which can be used by any other application that reads only this kind of raw video source through V4L2.

      - mp4live can read the raw video from this V4L2 virtual device.

      - For the audio I don't know how to do this yet (in fact for the video either, but your post gave me some ideas), but I know ALSA has a loopback driver. I think that may be the solution.

      In conclusion: transform the signaling environment, whatever it is, into a streaming environment, using PWLib, OPAL, mp4live, DSS, and some player.
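
      For the named-pipe part, a rough, untested sketch of the writer side could look like this. The path, frame size and frame count are just placeholders, and nothing here is specific to simpleOpal or OPAL, only plain POSIX calls:

      // Sketch: write raw YUV420P frames into a named pipe that another process
      // (e.g. a V4L2 virtual device feeder or mp4live) can read.
      #include <fcntl.h>
      #include <sys/stat.h>
      #include <sys/types.h>
      #include <unistd.h>
      #include <cerrno>
      #include <cstdio>
      #include <vector>

      static const unsigned kWidth      = 352;                        // CIF
      static const unsigned kHeight     = 288;
      static const size_t   kFrameBytes = kWidth * kHeight * 3 / 2;   // YUV420P

      int main()
      {
        const char * fifoPath = "/tmp/video.yuv";                     // placeholder path
        if (mkfifo(fifoPath, 0600) == -1 && errno != EEXIST) {
          perror("mkfifo");
          return 1;
        }

        // open() blocks until the consumer opens the other end for reading.
        int fd = open(fifoPath, O_WRONLY);
        if (fd == -1) { perror("open"); return 1; }

        // Dummy frame; in the real endpoint this buffer would be each decoded frame.
        std::vector<unsigned char> frame(kFrameBytes, 0);
        for (int i = 0; i < 100; ++i) {
          if (write(fd, &frame[0], kFrameBytes) != (ssize_t)kFrameBytes) {
            perror("write");
            break;
          }
        }
        close(fd);
        return 0;
      }

      Because it is a FIFO, the data stays in kernel buffers (RAM) instead of touching the disk, and the reader just consumes fixed-size frames from the same path.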

      ufff... 

      Any suggestions?

      (sorry for my English, I'm just learning this language)

       
    • Marcos F. J.

      Marcos F. J. - 2007-11-01

      Hi CarlosH

      There is another guy with the same question as you. The only way I can see to capture the audio and video streams is to write your own Channel descendants for use with OpenH323/OPAL.
      You can see an example of a hand-made audio channel at http://toncar.cz/openh323/tut/; there is a link to the pt-BR version too.
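
      Just to give the idea (this is not the tutorial's code, only an untested sketch; the class name and file path are made up and the exact PWLib signatures and members may vary between versions), such an audio channel could be a PIndirectChannel descendant that dumps whatever the codec writes:

      // Sketch: an audio channel that dumps the decoded PCM it receives.
      // OpenH323 calls Write() with raw 16-bit PCM to be played, and Read()
      // when it needs audio to transmit.
      #include <ptlib.h>
      #include <cstring>

      class PcmDumpChannel : public PIndirectChannel   // hypothetical class
      {
        PCLASSINFO(PcmDumpChannel, PIndirectChannel);
      public:
        PcmDumpChannel(const PFilePath & path)
          { dumpFile.Open(path, PFile::WriteOnly); }

        // Received (decoded) audio arrives here; dump it and pretend it was "played".
        virtual BOOL Write(const void * buf, PINDEX len)
          {
            if (dumpFile.IsOpen())
              dumpFile.Write(buf, len);
            lastWriteCount = len;     // report that everything was consumed
            return TRUE;
          }

        // Transmit silence; a real implementation could read from a pipe instead.
        virtual BOOL Read(void * buf, PINDEX len)
          {
            memset(buf, 0, len);
            lastReadCount = len;
            return TRUE;
          }

      protected:
        PFile dumpFile;
      };

      Such a channel would then be attached from the endpoint's OpenAudioChannel with codec.AttachChannel(channel, TRUE), much like the video code later in this thread.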

       
    • Marcos F. J.

      Marcos F. J. - 2007-11-01

      Oh my, I'm snoozing... the other post is your post... I need a coffee...

       
    • Carlos Hernandez S.

      ¬¬0 ... not a problem... :p ... just one more question...

      Do you know about the YUVFile driver?... I ask because I've passed the --displaydriver "YUVFile" parameter and the --display "file.yuv" parameter to simpleOpal, and the video is saved to a raw yuv420p file... just what I want... great!! :-) ... but for audio there is no driver like a "WavFile" :P ... that would be amazing...

      I will try what you said about Channel descendants for the audio...

      thanks...
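
      (If there is no such driver, I guess a raw PCM dump could still be turned into a .wav just by prepending a 44-byte header. A little untested helper, assuming 16-bit mono PCM at 8 kHz and a little-endian machine; adjust the rate and channels to whatever the codec actually produces:)

      // Sketch: prepend a canonical 44-byte WAV header to raw 16-bit PCM data.
      // WAV fields are little-endian; this writes them as-is, so it assumes
      // a little-endian host.
      #include <cstdint>
      #include <cstdio>

      static void writeWavHeader(FILE * f, uint32_t dataBytes,
                                 uint32_t sampleRate = 8000, uint16_t channels = 1)
      {
        const uint16_t bitsPerSample = 16;
        const uint32_t byteRate   = sampleRate * channels * bitsPerSample / 8;
        const uint16_t blockAlign = channels * bitsPerSample / 8;
        const uint32_t riffSize   = 36 + dataBytes;

        fwrite("RIFF", 1, 4, f);  fwrite(&riffSize, 4, 1, f);
        fwrite("WAVE", 1, 4, f);
        fwrite("fmt ", 1, 4, f);
        uint32_t fmtSize = 16;    fwrite(&fmtSize, 4, 1, f);
        uint16_t pcmTag  = 1;     fwrite(&pcmTag, 2, 1, f);   // PCM format tag
        fwrite(&channels, 2, 1, f);
        fwrite(&sampleRate, 4, 1, f);
        fwrite(&byteRate, 4, 1, f);
        fwrite(&blockAlign, 2, 1, f);
        fwrite(&bitsPerSample, 2, 1, f);
        fwrite("data", 1, 4, f);  fwrite(&dataBytes, 4, 1, f);
        // the raw little-endian 16-bit samples follow
      }

      A proper PWLib sound device would of course be nicer, but for a quick dump this is enough to make the file playable.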

       
    • Marcos F. J.

      Marcos F. J. - 2007-12-20

      Try not to use the YUVFile driver or anything of that kind; they use ordinary filesystem files to share data between processes. Sharing a file means using the hard drive, the worst thing you can have for streaming because of the high read/write latency.

      Pipes are cool: in the POSIX world a file opened this way is a pipe, which uses RAM to transfer the data, but you cannot count on that in other environments (like Windows).

      The best thing you can do is derive your own PVideoOutputDevice (or its audio counterpart) and hand-write a solution you can fully guarantee. I did this to transfer the video data into the Java VM's world.

      The basic thing in these classes is to override the Open, Start, Stop and SetFrameData methods. SetFrameData receives the raw data already decoded by the codec.

      The code below is a modified part of my code showing how to open a video channel in an H323EndPoint. It is part of the sub-project RNP-2472 and is under the GNU General Public License.

      // Note: grabber, videoChannelRecordDevice, videoChannelPlayDevice, progConf, sendVideo,
      // videoQuality, videoBitRate and adaptivePacketDelay are members of MyEndPoint.
      // To do: create, configure and open the video channel elsewhere; keep only the channel attaching here.
      BOOL MyEndPoint::OpenVideoChannel(H323Connection &, BOOL isEncoding, H323VideoCodec & codec)
      {
        PVideoChannel * channel = new PVideoChannel();

        if (isEncoding) {
          // Transmit direction: attach a grabber (video input device) to the channel.
          if (!sendVideo) { delete channel; return FALSE; }

          videoChannelRecordDevice = progConf.videoDevice;
          // assumes videoChannelRecordDevice is a PString; *= is PWLib's case-insensitive compare
          if (videoChannelRecordDevice.IsEmpty() || (videoChannelRecordDevice *= "null"))
            videoChannelRecordDevice = "fake";

          grabber = PVideoInputDevice::CreateDeviceByName(videoChannelRecordDevice, NULL);
          if (grabber == NULL) { delete channel; return FALSE; }

          channel->AttachVideoReader(grabber, FALSE);
          if (!channel->GetVideoReader()->Open(videoChannelRecordDevice, TRUE)) {
            delete channel;
            delete grabber;
            return FALSE;
          }

          channel->GetVideoReader()->SetVideoFormat(PVideoDevice::Auto);
          channel->GetVideoReader()->SetColourFormatConverter("YUV420P");
          channel->GetVideoReader()->SetFrameSize(PVideoDevice::CIFWidth, PVideoDevice::CIFHeight);
          channel->GetVideoReader()->SetFrameRate(15);

          codec.SetTxQualityLevel(videoQuality);
          codec.SetTxMinQuality(10);
          codec.SetBackgroundFill(0);
          if (videoBitRate != 0)
            codec.SetMaxBitRate((unsigned)videoBitRate);
          if (adaptivePacketDelay)
            codec.SetVideoMode(H323VideoCodec::AdaptivePacketDelay);
        }
        else { // not encoding: attach a player (video output device) for the received video
          videoChannelPlayDevice = "SDL";
          channel->AttachVideoPlayer(PVideoOutputDevice::CreateDeviceByName(videoChannelPlayDevice, NULL), FALSE);
          if (!channel->GetVideoPlayer()->Open(videoChannelPlayDevice, TRUE)) { delete channel; return FALSE; }
        }

        return codec.AttachChannel(channel, TRUE);
      }
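
      And to sketch the output-device side I mentioned above (just an untested illustration with a made-up class name; the exact set of virtual methods you must override, and whether BOOL or PBoolean is used, depends on the PWLib version, and you may also need things like GetDeviceNames()), a descendant that pushes each decoded frame into a pipe could look roughly like this:

      // Sketch: a video output "device" that forwards decoded YUV420P frames
      // into a file descriptor (e.g. a named pipe) instead of displaying them.
      #include <ptlib.h>
      #include <ptlib/videoio.h>
      #include <fcntl.h>
      #include <unistd.h>

      class PVideoOutputDevice_PipeDump : public PVideoOutputDevice   // hypothetical class
      {
        PCLASSINFO(PVideoOutputDevice_PipeDump, PVideoOutputDevice);
      public:
        PVideoOutputDevice_PipeDump() : fd(-1) { }
        ~PVideoOutputDevice_PipeDump() { Close(); }

        virtual BOOL Open(const PString & name, BOOL /*startImmediate*/ = TRUE)
          {
            deviceName = name;                 // here the "device name" is the pipe path
            fd = ::open(name, O_WRONLY);
            return fd >= 0;
          }

        virtual BOOL IsOpen() { return fd >= 0; }

        virtual BOOL Close()
          {
            if (fd >= 0) { ::close(fd); fd = -1; }
            return TRUE;
          }

        virtual BOOL Start() { return TRUE; }  // nothing to start or stop for a pipe
        virtual BOOL Stop()  { return TRUE; }

        virtual PINDEX GetMaxFrameBytes()
          { return frameWidth * frameHeight * 3 / 2; }   // YUV420P

        // Called by the codec with decoded frame data.
        virtual BOOL SetFrameData(unsigned /*x*/, unsigned /*y*/,
                                  unsigned width, unsigned height,
                                  const BYTE * data, BOOL /*endFrame*/ = TRUE)
          {
            PINDEX bytes = width * height * 3 / 2;
            return fd >= 0 && ::write(fd, data, bytes) == (ssize_t)bytes;
          }

      protected:
        int fd;
      };

      It could then be attached directly in the non-encoding branch above, for example channel->AttachVideoPlayer(new PVideoOutputDevice_PipeDump, FALSE), instead of going through CreateDeviceByName.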

       
