Re: [Mlt-devel] MLT for a mixed live and file based producer project
From: Louis S. <lou...@gm...> - 2011-08-18 00:39:12
On Wed, Aug 17, 2011 at 8:23 PM, Dan Dennedy <da...@de...> wrote:
>
> On Wed, Aug 17, 2011 at 3:40 PM, Louis Simons <lou...@gm...> wrote:
> > I've been working on an open source video playout and real-time titling
> > package similar to the features of Newtek's Video Toaster. I've been
> > testing out pieces of implementation with Gstreamer as the backend and
> > have been finding that it is very fragile when it comes to combining
> > realtime (I've been using video4linux) and prerecorded sources (files).
> > I've got a case of the grass is always greener and was hoping to give
> > MLT a try, since looking at the documentation, it seems like a much
> > simpler interface, without the background complexities of first
> > mastering glib. On top of that, if I understand the build process
> > right, MLT treats higher-level language bindings like first class
> > citizens, and I would've loved to work
>
> Not exactly first-class, since some C features are not fully available
> in Ruby, and they are generated such that they did not get the full
> "Ruby way" treatment. You will be OK as long as you do not need to get
> the binary audio and video into Ruby. Of course, if/when you ever need
> to, you can improve the binding in that area. At least they are always
> up to date when you enable them in your build, but typically only the
> Python binding is available in distro packages because OpenShot needs
> it.

That sounds good enough to me. I feel like if I need to deal with the
binary audio/video, it would make more sense to go down to a C module for
that.

> > with Gstreamer in Ruby if only the bindings were up to date.
> > From reading the mailing list archives, I found a reference from
> > February 2011 saying that live sources were pretty badly broken. I was
> > wondering if that is still the case? Also, from the wiki fundamentals
> > of MLT page, I couldn't
>
> That was really in reference to libavdevice live inputs, and that is
> no longer true. Project sponsors have funded work to fix that, and
> now video4linux2, ALSA, and network streams work well. Also, there is
> DeckLink SDI and HDMI input. DV/FireWire works via pipe input, but HDV
> is flaky.
>
> > find much information on using live sources with transitions or mixing
> > live and file producers. Is mixing live and file producers outside the
> > scope of
>
> Some info about live sources is in the FAQ. For avformat-based
> sources, it follows the FFmpeg docs fairly closely. For DeckLink, you
> use "decklink:". You can mix file with live very well. The watermark
> filter is simpler to use than multiple tracks and a composite filter:
>
>   melt noise: -filter watermark:demo/watermark1.png
>
> > MLT? I'm going to be digging into MLT to learn more, but would greatly
> > appreciate it if someone could warn me I'm barking up the wrong tree
> > with the framework.
>
> It is not the wrong tree at all, but prepare to do a lot of digging, as
> there are not a lot of people here to help answer questions, and I am
> already stretched thin.

I appreciate you answering my questions so quickly. At this point I'm
just going to focus on getting it to compile and head back up the
learning curve. It's nice to know that I'm not running myself into a
corner on this project.

> --
> +-DRD-+
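[Editor's note: the watermark example in the thread can be extended to the live-plus-file mixing Louis asks about. The following is a hedged sketch, not from the original thread: it assumes melt was built with avformat support, that /dev/video0 is a working video4linux2 device, and that a local clip.mp4 exists; the geometry values are purely illustrative.]

```shell
# Overlay a PNG title on a live camera feed, mirroring Dan's watermark
# example but with a live source (device path is an assumption):
melt avformat:/dev/video0 -filter watermark:demo/watermark1.png

# The multi-track alternative Dan mentions: composite the live feed over
# a file producer using the composite transition (geometry illustrative):
melt clip.mp4 -track avformat:/dev/video0 \
     -transition composite geometry=5%/5%:40%x40%
```

These invocations depend on the local build and hardware, so treat them as a starting point against the melt documentation rather than known-good commands.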