mediapipe-devel Mailing List for MediaPipe
Status: Beta
Brought to you by:
mobodo
Messages by month:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2002 |     |     | 16  |     | 4   | 1   | 2   |     |     | 17  |     |     |
| 2003 |     |     |     |     |     | 1   |     |     |     |     |     |     |
| 2004 |     |     |     |     |     |     |     | 2   |     |     |     |     |
| 2005 |     |     |     |     |     |     |     |     |     |     |     | 1   |
| 2006 | 1   | 1   |     |     |     |     |     |     |     |     |     |     |
From: Rick B. <bl...@vo...> - 2006-02-13 21:53:55
MovieLand home page says it is a MediaPipe program. I tried MovieLand and did like it; YOUR PROGRAM would not let me CANCEL in YOUR THREE DAYS. YOUR payment reminders are nothing more than THREATS and HARASSMENT! California law forbids THREATS and HARASSMENT for collection of payment! I want the THREATS and HARASSMENT VIRUS that you call "payment reminders" to stop now!

Rick Brown
From: Tina A. T. <ti...@ea...> - 2006-01-09 14:22:33
To whom this concerns: I have an annoying pop-up of a blonde saying we owe money for MovieLand, which we are supposed to have signed up for. Please remove it. I have young children and don't like these pop-ups. Please respond to me as soon as possible.

Thank you,
Tina Thomas
ti...@ea...
From: Victor V. <vvi...@sb...> - 2005-12-29 03:38:31
To whom it may concern: My computer does not support your software. I need to cancel before I get charged for the part that was downloaded to my computer. Please let me know if I am in the right place. Thank you.
From: Gaspard P. <mo...@us...> - 2004-08-05 12:23:49
Hello Christian,

MediaPipe is a semi-open initiative. The source code for the application and plugins is available, but the MediaPipe framework's source is not. You should, however, be able to compile most of what is in the CVS, with the exception of the MPipes09 folder, by simply linking to the framework provided with the application. The MPipes09 folder contains the next-generation pipes for the not yet released framework.

If you are having trouble with a specific directory, I can have a look - but to be honest I haven't compiled anything other than MPipes09 in months...

Regards,
Gaspard

On Aug 5, 2004, at 7:05 AM, Christian Weykopf wrote:
> Hi,
>
> I'm trying to compile the whole thing from CVS - without success.
> The framework sources are missing.
> Why?
> Where can I get them?
>
> have a nice day
> Chris
>
> --
> Christian Weykopf
>
> <http://www.weykopf.de>
From: Christian W. <cw...@we...> - 2004-08-05 11:05:29
Hi,

I'm trying to compile the whole thing from CVS - without success. The framework sources are missing. Why? Where can I get them?

Have a nice day,
Chris

--
Christian Weykopf
<http://www.weykopf.de>
From: Viperman <vip...@us...> - 2002-10-09 04:19:37
Something like this would ROCK! The modify pipe window is a little wacky, but I just generally think it should have its own window. What do you think? I worked really hard on it.

Doku
From: Viperman <vip...@us...> - 2002-10-09 03:40:40
On Tuesday, October 8, 2002, at 10:05 PM, Gaspard Petit wrote:
> I'm very glad that you worked it out =) (sorry for not responding, I
> was away for the last few days). BTW, if you want me to have a look
> at your code to make sure everything is alright, feel free to send me
> a copy... But what I have seen looked great! If there is anything that
> I can do to further help you, just ask =)

I'm pretty sure I got most of it now... I just had to do a lot of digging in the .h files to get to all the nitty-gritty of what certain functions were returning. The last little piece of the puzzle came together today, but I had one question left. To make a pipe, you first make a base class, called YourPipe, then a YourPipeInstance : YourPipe. Do YourPipe's variables remain constant? Or is this class created new for every frame? I'm pretty sure it remains, since that's where I put my frame counter, but the pix maps I was holding over till the next iteration kept ending up black. Chances are, that was when I had bad loop variables (the whole width * 4 thing threw me for a loop till I figured out it was pulling 8 bits of a 32-bit color at a time).

Oh! Also, what's the general method for handling 12-bit color? Or is there a way to specify that the color must be at least 24-bit in a stream-derived pipe? Can I just use a uint8, and then 4 bools? *lol* I forget if there's a nibble-size type in C++...

Thanks for the comments on the de-interlacer. Believe me, there are worse horizontal pans in that Anime... They look horrible with the jerking, due to the extra frame.

> Congrats on your first pipe! (it wasn't that hard, was it? ;-) )

Nope, and I hope you include my deinterlacer in the release when I get it finished (there will be Drop, Dupe, Blend, Weave and PALConvert in it). I'm going to put the watermarker on the back burner; how about a pipe that moves to a certain point of the stream? I can write that with that member function of the stream class, I believe. Anyway, the pipes I would like to work on so far are the following:

- The Deinterlacer, which is about 60% done
- A watermarker, for which I need to learn how to do bitmap reads to get the file IN the watermarker... I can read the bitmap; it's the header I don't have a layout for.
- Changing the FrameRateConverter to also accept standard framerates and convert the Scale.

> Oh! and if you have comments about what is good and what sucks about
> the interface, make sure you tell me (I am in the process of
> redesigning the SDK for 0.9, so all comments are very appreciated)

The SDK interface or the interface of the program? Well, the SDK was just fine. I think that you should include a blank project for streams and frames with some of the required member functions laid out - I can do that if you want - but other than that, it just needs a little explanation, especially a warning about ALWAYS watching your color depth in loops. Also, I think your base classes need flags to specify what color depths they can accept, to improve system stability. For example, if MyPipe can only accept 24- and 32-bit color, it shouldn't be available in YU12 mode, simply because it's a stream pipe. (Or maybe there's a way to do that that I don't know about... if there is, then the flags need to be required for at least stream pipes.)

The interface of the program is OK, but I believe that it should function more like VirtualDub. Most video editing/conversion utilities have two windows, so you can see what you are doing. Lining up the deinterlace lines sucked, because I was running at about 35 FPS and I couldn't pause fast enough to see individual frames, so a frame stepper from original to piped is a "must" for my kind of work. I think it would be neat if you showed how the pipes affected it on a frame-by-frame basis if someone was debugging their pipe as well. Just a debug button that would show a preview for each pipe, scaled down to fit on the screen in an orderly fashion, or maybe a window with a step button to move to the next pipe, and it would change the display depending on what the next pipe was. Just a handy tool for debugging.

Also, while I really REALLY like Drawers, I think you got it backwards... the video needs to be the window, and the pipes need to be in the Drawer. This way, when pipes are completed, you should be able to easily switch between pipes with a dropdown menu somewhere on the main window, and only open the drawer when you need to modify the pipe.

OH, and this is a BIG one for me: the File Opener and File Streamer... I really think those need to be integrated into the program, and not be pipes. It makes the concept of making pipes data dependent, and I want to use my pipes over and over again. I think the input stream should be asked of the user before Start can be pressed. I see MediaPipe as a library of scripts of different pipes to run on the video, so every time I rip a DVD, for example, there's a set of actions I want to do for each DVD, and that needs to never change, except for the filename.

I can see how this program evolved; I bet there wasn't even a preview drawer for a long time. lol, it's just how I would write it: make the pipes work, then make the interface user friendly. I really like the program, and even if these things didn't happen, I would use it and program for it, but every video editing software I have ever used has a before and after picture, and the filters (or pipes in this case) are scripts that are loaded, not modified, and are not data dependent. I can see this program evolving to be much better than VDub ever was, especially since YOUR CODEC library isn't dependent on what's "installed" on the system. Just load up the pipe and go.

BTW: If you can convince SOMEONE to write a DivX ;-) 3.11 Audio Decoder for MediaPipe, that would be SO awesome for fixing bad DivX AVIs for QuickTime export.

I'll draw up the interface I would like to see in Interface Builder and email it to you so you can look at it, although I know absolutely NOTHING about interface programming. Just a suggestion in living color to look at. Hope that helps!

Doku
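To make the "width * 4" point above concrete: in a packed ARGB32 pixmap each pixel is four bytes, and rows are rowBytes apart (which can be larger than width*4). The following short loop is purely illustrative - the function and parameter names are hypothetical and it is not taken from the MediaPipe SDK:

    // Illustrative only: walk a packed ARGB32 image one pixel at a time.
    // Each pixel is 4 bytes (A, R, G, B); rows are rowBytes apart.
    #include <cstdint>

    void halveBrightness(uint8_t* baseAddr, int width, int height, int rowBytes)
    {
        for (int y = 0; y < height; ++y)
        {
            uint8_t* row = baseAddr + y * rowBytes;
            for (int x = 0; x < width; ++x)
            {
                uint8_t* pixel = row + x * 4;   // 4 bytes per 32-bit pixel
                pixel[1] /= 2;                  // R
                pixel[2] /= 2;                  // G
                pixel[3] /= 2;                  // B (pixel[0] = alpha, untouched)
            }
        }
    }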
From: Gaspard P. <mo...@us...> - 2002-10-09 03:06:44
On Tuesday, October 8, 2002, at 10:22 PM, Viperman wrote:
> PAL conversion deinterlace is finally written, check out the results:
>
> http://homepage.mac.com/nicholaswong/bad.html
> This is standard resize and drop A de-interlace, the cleanest way to do
> it without blurring.
>
> http://homepage.mac.com/nicholaswong/good.html
> This is the FFFII -> FFFF method that I made. What do you think???
>
> Doku

I'm very glad that you worked it out =) (sorry for not responding, I was away for the last few days). BTW, if you want me to have a look at your code to make sure everything is alright, feel free to send me a copy... But what I have seen looked great! If there is anything that I can do to further help you, just ask =)

Congrats on your first pipe! (it wasn't that hard, was it? ;-) )

Oh! and if you have comments about what is good and what sucks about the interface, make sure you tell me (I am in the process of redesigning the SDK for 0.9, so all comments are very appreciated).
From: Viperman <vip...@us...> - 2002-10-09 02:22:30
PAL conversion deinterlace is finally written; check out the results:

http://homepage.mac.com/nicholaswong/bad.html
This is standard resize and drop A de-interlace, the cleanest way to do it without blurring.

http://homepage.mac.com/nicholaswong/good.html
This is the FFFII -> FFFF method that I made. What do you think???

Doku
From: Viperman <vip...@us...> - 2002-10-06 02:23:42
Alright, I got the de-interlace methods written for the standard Deinterlacer: Drop A, Drop B, Dup A, Dup B, All frames interlaced A-> and All frames interlaced B-> (this is probably called weaving, I don't know all the names). I'll just put the original pipe in as blend B, and make a blend A... As far as I know, that pretty much covers it. If there's a way to detect interlacing, I sure don't know how to do it, but let me know if you know some method of detecting it (it may be embedded into MPEG-2 somewhere as a flag, I dunno, but I know MPEG-2 was adopted because it supported it).

The methods I have yet to write, due to the stream questions I posted earlier, are PAL Conversion Deinterlace and Convert to 2x FPS (which is what the TV does anyway). I also have no idea how to drive Carbon, but I designed the interface for it in Interface Builder. I'm not going to try to put the two together till all the methods are written and tested.

Also, I had some feature requests... On the preview drawer, can there be a step-frame button? Also, if possible, can there be a before image there too, so that you can see what the pipes are doing? And most helpful of all would be a start and end marker, so that one can do a piece of the clip instead of the whole thing.

Thanks!
Doku
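A minimal sketch of the "blend" method mentioned above, assuming packed ARGB32 buffers; this is illustrative, self-contained code with hypothetical names, not part of the MediaPipe SDK. Each output row is the average of two adjacent source rows, which merges the A and B fields at the cost of some softness:

    // Illustrative only: blend deinterlace on a packed ARGB32 buffer by
    // averaging each row with the one below it (the last row is copied as-is).
    #include <cstdint>

    void blendDeinterlace(const uint8_t* src, uint8_t* dst,
                          int width, int height, int rowBytes)
    {
        for (int y = 0; y < height; ++y)
        {
            const uint8_t* a = src + y * rowBytes;
            const uint8_t* b = src + (y + 1 < height ? y + 1 : y) * rowBytes;
            uint8_t* out = dst + y * rowBytes;
            for (int x = 0; x < width * 4; ++x)          // 4 bytes per pixel
                out[x] = static_cast<uint8_t>((a[x] + b[x]) / 2);
        }
    }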
From: Gaspard P. <mo...@us...> - 2002-10-06 01:39:16
Oh! I see =) I think you are getting confused because of the two examples that I gave you... the MPImageFilterPipe modifies the content of the image, but not the number of frames, and the MPSISOPipe modifies the number of frames but not their content... The MPImageFilterPipe is written on top of the MPSISOPipe - changing the content of a frame is easy =)

copyFrameSettings() does not only copy the image; it also copies the frame duration, time scale and everything else associated with the image. You usually want to call this first. If you want to change the image, you would normally want to have your own image buffer, and then you would simply replace the output stream's buffer with yours. It would look something like this (I use the prefix 'm' for member variables, and assume that you support only ARGB32 for both input and output):

    MyPipe::MyPipe()
    : mBuffer(NULL)
    {...}

    MPError MyPipe::setInput(...)
    {
        delete[] mBuffer;
        mBuffer = new char[input->getData()->pixMap.width *
                           input->getData()->pixMap.height * 4];
        // now you have a buffer that can hold width*height ARGB32 pixels
        return kMPNoErr;
    }

    MPError MyPipe::processFrame(...)
    {
        output->copyFrameSettings(input);

        // for this example, I'll invert the colors
        int width    = input->getData()->pixMap.width;
        int height   = input->getData()->pixMap.height;
        int rowBytes = input->getData()->pixMap.rowBytes;

        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width*4; x++)   // 4 bytes per ARGB32 pixel
            {
                mBuffer[x + y*width*4] =
                    255 - input->getData()->pixMap.baseAddr[x + y*rowBytes];
            }
        }

        // change the buffer of the output
        output->getData()->pixMap.baseAddr = mBuffer;
        output->getData()->pixMap.rowBytes = width*4;   // don't forget this too!
        return kMPNoErr;
    }

I hope this answers what you are asking... If by "making your own stream and build this frame" you meant allocating your own buffer and having the output stream point to that, then yes, you have to do that =) In theory, you *could* mess directly with the input image buffer... but bad things can happen if the previous pipe expects its buffer to be unchanged the next time it processes something, so it's better to work with your own memory block...

On Saturday, October 5, 2002, at 12:39 PM, Viperman wrote:
> Ok, there was one question in that big long paragraph that I think I
> didn't make clear...
>
> Apparently there are two ways (so far) that I've seen to handle input.
> If you derive your class from MPSISOPipe, you get stream input, and if
> you derive it from MPImageFilterPipe, you get frame by frame. With
> MPSISOPipe, you can change the properties of the stream on a frame
> basis, duplicate frames by calling copyFrameSettings multiple
> times, or drop frames by processing frames from the input, but if I
> need to modify a frame in that input, is there some way to change the
> pixMap of the input stream? What I'm doing with the PAL Deinterlace is
> kind of strange, and this is why I've been fighting de-interlacing for
> 6 months.
>
> When some video is converted from PAL to NTSC, they decide the best
> way to change the framerate from 23.9 to 29.9 is to add 5 frames of
> interlacing. So, for every 4 frames FFFF (F = full frame) they take one
> of them, I'll use the third as an example, and interlace the B into
> the previous A and the A into the next B, so you end up with FFIIF
> (where I = Interlaced), or A1B1, A2B2, A2B3, A3B4, A4B4. So my PAL
> Deinterlacer combines the AB frames back to make A1B1, A2B2, A3B3,
> A4B4, and thus has to both drop 2 frames and insert a newly
> constructed frame from the two parts in the interlacing. WHEW! I got
> it working on VirtualDub, but that doesn't allow me to change
> framerates, so I had to duplicate one of the 3 interpreted frames, and
> it looked jumpy.
>
> Anyway, long story short, I see how to drop the two bad frames, that's
> easy enough, but I don't see how to copy anything into the output
> stream OTHER than the input stream, which is const. So, do I need to
> make my own stream and build this frame, then copy frame settings? Or
> is there a way to pass a frame into the output stream to be added?
>
> Thanks! The rest of the questions will get the other De-Interlacers
> written. :)
>
> Doku
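Not shown in the message above is the matching cleanup for the buffer allocated in setInput(). A minimal sketch, assuming MyPipe keeps the same mBuffer member as in the example; this is illustrative, not MediaPipe SDK code:

    // Hypothetical destructor for the MyPipe sketch above: free the buffer
    // that setInput() allocated with new[].
    MyPipe::~MyPipe()
    {
        delete[] mBuffer;
        mBuffer = NULL;
    }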
From: Viperman <vip...@us...> - 2002-10-05 16:39:07
Ok, there was one question in that big long paragraph that I think I didn't make clear...

Apparently there are two ways (so far) that I've seen to handle input. If you derive your class from MPSISOPipe, you get stream input, and if you derive it from MPImageFilterPipe, you get frame by frame. With MPSISOPipe, you can change the properties of the stream on a frame basis, duplicate frames by calling copyFrameSettings multiple times, or drop frames by processing frames from the input, but if I need to modify a frame in that input, is there some way to change the pixMap of the input stream? What I'm doing with the PAL Deinterlace is kind of strange, and this is why I've been fighting de-interlacing for 6 months.

When some video is converted from PAL to NTSC, they decide the best way to change the framerate from 23.9 to 29.9 is to add 5 frames of interlacing. So, for every 4 frames FFFF (F = full frame) they take one of them - I'll use the third as an example - and interlace the B into the previous A and the A into the next B, so you end up with FFIIF (where I = Interlaced), or A1B1, A2B2, A2B3, A3B4, A4B4. So my PAL Deinterlacer combines the AB frames back to make A1B1, A2B2, A3B3, A4B4, and thus has to both drop 2 frames and insert a newly constructed frame from the two parts in the interlacing. WHEW! I got it working on VirtualDub, but that doesn't allow me to change framerates, so I had to duplicate one of the 3 interpreted frames, and it looked jumpy.

Anyway, long story short, I see how to drop the two bad frames, that's easy enough, but I don't see how to copy anything into the output stream OTHER than the input stream, which is const. So, do I need to make my own stream and build this frame, then copy frame settings? Or is there a way to pass a frame into the output stream to be added?

Thanks! The rest of the questions will get the other De-Interlacers written. :)

Doku
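The field recombination step described above - taking field B from one interlaced frame and field A from the next to rebuild a progressive frame - can be sketched as follows. This is illustrative, self-contained code with hypothetical names, not part of the MediaPipe SDK:

    // Hypothetical helper: rebuild one progressive frame by taking the even
    // rows (field A) from frameWithA and the odd rows (field B) from
    // frameWithB. Buffers are packed ARGB32, rows spaced rowBytes apart.
    #include <cstdint>
    #include <cstring>

    void weaveFields(const uint8_t* frameWithA, const uint8_t* frameWithB,
                     uint8_t* dest, int width, int height, int rowBytes)
    {
        for (int y = 0; y < height; ++y)
        {
            const uint8_t* src = (y % 2 == 0) ? frameWithA : frameWithB;
            std::memcpy(dest + y * rowBytes, src + y * rowBytes,
                        width * 4);          // 4 bytes per ARGB32 pixel
        }
    }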
From: Gaspard P. <mo...@us...> - 2002-10-05 13:06:50
On Saturday, October 5, 2002, at 03:00 AM, Viperman wrote:
> question 1: What does MPStreamOpaqueStruct__::processFrameAsync() do?
> Drop a frame?

In beginSequence, you are given the first frame of the sequence. From there on, you are responsible for fetching the next frame. For example, you could use the first frame three or four times in processFrame, and then decide that you need a new frame to process... (this is just an example; you normally want a new frame for every processFrame). There are two ways to obtain the next frame:

- call MPStream->processFrame - this blocks your processing until the new frame is processed
- call MPStream->processFrameAsync - this tells MediaPipe to process a new frame in a separate thread but doesn't block you. Once you really really need the next frame, you call MPStream->processFrameWait and then MPStream->processFrameGetError.

Typically, a "processFrame" loop can look either like this:

    MPError MyPipe::beginSequence(...)
    {
        isFirstFrame = true;
        return kMPNoErr;
    }

    MPError MyPipe::processFrame(...)
    {
        if (isFirstFrame)
            isFirstFrame = false;
        else
        {
            err = inStream->processFrame();
            if (err) return err;
        }

        // do your processing here...
        return kMPNoErr;
    }

or like that:

    MPError MyPipe::processFrame(...)
    {
        err = inStream->processFrameWait();
        if (err) return err;
        err = inStream->processFrameGetError();
        if (err) return err;

        // do your processing here...

        err = inStream->processFrameAsync();
        if (err) return err;
        return kMPNoErr;
    }

Notice that the second version, although slightly more complicated, will be more efficient, especially if there is IO involved or if the user has a multi-processor system. Also, remember that it is very important that you check for errors, as you will get one when there are no more frames in the sequence...

> question 2: If I step through a stream using MPStreamOpaqueStruct__::
> processFrame() and snarf the data with MPStreamOpaqueStruct__::
> getData(), can I build my own stream? If so, is there a way to just
> insert a single frame using MPStreamOpaqueStruct__::
> copyFrameSettings(), such as inserting a stream with one frame in it?
> Is this system designed so that I can process multiple frames in one
> processFrame call of my custom class? (I've been looking in depth at
> the ChangeFrameRate pipe, though counting frames by Scale and Duration
> is sort of new to me - scale being how many durations fit per second,
> and duration being, I dunno... milliseconds the frame is displayed? Oh
> well, I don't need to know exactly what they mean, just the math to
> change them lol)

You can only output 1 frame per processFrame. However, you are in control of your input - so you can reuse the same input frame many times for many output frames... For example:

    MPError MyPipe::processFrame(...)
    {
        if (i < 5)
        {
            i++;
            return outStream->copyFrameSettings(inStream);
        }
        i = 0;
        err = inStream->processFrame();
        if (err) return err;
        return outStream->copyFrameSettings(inStream);
    }

This will output 6 frames for each input frame...

In ChangeFrameRate, we use a "rational number" approach for the framerate, since some frame rates are very weird - for example NTSC, which is 29.97... but NTSC is actually 30000/1001 frames per second. So in this case you would use a scale of 30000, and each frame would have a duration of 1001... In other words:

In 1 second, you have 30000 "time units".
In 1 frame, you have 1001 units...
Thus, you find that in 1 second, you will have exactly 30000/1001 frames.

I know it's weird the first time you use this system... =)

> OK, that was a long question... I also had more questions, short I
> promise.
>
> is pixMap->rowBytes the offset between rows AND the amount of bytes
> of data in the row, or the number of bytes of data in the row alone?
> (After programming for VDub, this is a valid question. ;)

rowBytes would normally be width*pixelSize. However, for better alignment you often want the first pixel of each row to start on a 32-bit-aligned address, so sometimes the rowBytes is slightly bigger than width*pixelSize. You can also use the rowBytes to focus on a sub-image within an image (consider if I have a large image but give you the baseAddress with an offset, a smaller height and a smaller width: if the rowBytes is not changed, you will *think* that I gave you a small image while I actually gave you a bigger one - that's how the crop works). So when changing row, you should use rowBytes, not width*pixelSize...

> also, after modifying the outframe to different dimensions,
> outImage->height, outImage->width and outImage->length, are there any
> other variables I need to set for a resolution change? Every time I
> change resolution I get a crash in my pipes.

outImage->rowBytes and outImage->baseAddress =)

> Thanks for the help! This should help me get the Deinterlacers (all of
> them) written in a few days.

Cool! Can't wait to see =) Don't hesitate to ask more questions...
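A tiny self-contained illustration of the rational time base described above (this is not MediaPipe SDK code; the names are illustrative): with scale = 30000 and a per-frame duration of 1001, the stream plays at about 29.97 frames per second and all durations stay exact integers.

    // Illustrative only: the scale/duration arithmetic described above.
    #include <cstdio>

    int main()
    {
        const long scale    = 30000;   // time units per second
        const long duration = 1001;    // time units per frame (NTSC)

        double fps = static_cast<double>(scale) / duration;            // ~29.97
        long frames = 300;                                              // e.g. 300 frames
        double seconds = static_cast<double>(frames) * duration / scale;

        std::printf("%.5f fps; %ld frames span %.3f seconds\n", fps, frames, seconds);
        return 0;
    }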
From: Viperman <vip...@us...> - 2002-10-05 07:00:02
Disregard that previous email, I'm getting most of it so far. A few quick questions...

Question 1: What does MPStreamOpaqueStruct__::processFrameAsync() do? Drop a frame?

Question 2: If I step through a stream using MPStreamOpaqueStruct__::processFrame() and snarf the data with MPStreamOpaqueStruct__::getData(), can I build my own stream? If so, is there a way to just insert a single frame using MPStreamOpaqueStruct__::copyFrameSettings(), such as inserting a stream with one frame in it? Is this system designed so that I can process multiple frames in one processFrame call of my custom class? (I've been looking in depth at the ChangeFrameRate pipe, though counting frames by Scale and Duration is sort of new to me - scale being how many durations fit per second, and duration being, I dunno... milliseconds the frame is displayed? Oh well, I don't need to know exactly what they mean, just the math to change them lol)

OK, that was a long question... I also had more questions, short I promise.

Is pixMap->rowBytes the offset between rows AND the amount of bytes of data in the row, or the number of bytes of data in the row alone? (After programming for VDub, this is a valid question. ;)

Also, after modifying the outframe to different dimensions - outImage->height, outImage->width and outImage->length - are there any other variables I need to set for a resolution change? Every time I change resolution I get a crash in my pipes.

Thanks for the help! This should help me get the Deinterlacers (all of them) written in a few days.

Doku
From: Viperman <vip...@us...> - 2002-10-04 17:21:22
Great, I got all of it now. I only have one more question, how do I include my custom pipe into MediaPipe? Thanks again for all the help, Doku On Friday, October 4, 2002, at 12:04 PM, Gaspard Petit wrote: > sorry, you're right... there was a cvs login missing if I remember > well... =) > > The 41 warnings are normal... > > If you want to checkout everything, you specify "." as the module as > in cvs -d blahblah co . > > Good luck =) > > On Friday, October 4, 2002, at 12:40 PM, Viperman wrote: > >> >> OK, I read up on the commandline CVS, and I just needed to log in >> anonymously first to be able to download modules, but I can't >> download everything, which is fine by me, I just needed the >> FrameRateChanger. >> >> I got it to compile, with 41 errors due to private destructors. I >> hope this is ok. >> >> Thanks for all the help and please let me know if I did something >> wrong.... >> >> >> Doku >> >> >> >> On Friday, October 4, 2002, at 11:09 AM, Gaspard Petit wrote: >> >>> >>> On Thursday, October 3, 2002, at 11:48 AM, Viperman wrote: >>> >>>> Ok, well, for some reason my CVS client inside Project Builder >>>> isn't working, won't even let me enable it, but that's probably due >>>> to user error. If there's some kind of setup that I need, please >>>> let me know. I've never done CVS before. >>> >>> I believe that CVS in ProjectBuilder won't work if you use anonymous >>> cvs... ProjectBuilder expects to be able to make modifications to >>> the repository, which you cannot make with anonymous CVS. >>> >>> So for now, I'd say - just disable CVS in ProjectBuilder (it will >>> ask you so) >>> >>> You can download the source code from the CVS using the command line: >>> >>> cvs -d ano...@cv...:/cvsroot/mediapipe/ >>> co . >>> >>> That should checkout everything, including the SDK (it will be in >>> Externs/MediaPipeSDK I think). You should be find the pipes I >>> mentioned and compile them without problems... >>> >>>> >>>> I downloaded the source manually from sf.net and it all looks >>>> pretty straightforward. (thinking about modifying the framerate >>>> changer to support just asking for standard framerates...) All it's >>>> missing is the SDK, so I can't compile anything. Probably going to >>>> have to hack at it for a while, but I've been fighting interlaced >>>> signal for about 6 months now. At least the carbon code is making >>>> more sense to me. I usually write CLI unix telnet servers, so all >>>> this graphics stuff is new to me. However, I also wrote another >>>> little filter for VDub you guys might want that embeds a watermark >>>> in the signal (extremely basic). If I can get a BMP of whatever >>>> watermark they want to use I'm pretty sure I can hash out a >>>> watermark/logo embedder (and I probably will for my own purposes). >>>> The interface is the hardest part for me to write, but I might as >>>> well learn Carbon/Cocca while I have projects I can mathematically >>>> handle. >>> >>> The pipes are written in C++ and they use Carbon... but you need >>> Carbon only if you write a GUI... The cool thing is - you don't >>> *have* to write a GUI (see the getSettings/setSettings). The GUI is >>> optional... >>> >>> Watermarking would be easy, but it will be even better in 0.9 where >>> you will be able to say "I want two video input". You take the >>> first, you use the second for watermarking... if the second is >>> shorter, you loop it... if it's an image, it will be repeated on >>> every frame... that way, you need not worry about loading the image >>> and its format... 
More on this when 0.9 is actually available... >>> Meanwhile, you could also hack something quickly if it's for your >>> own >> purposes... >>> >>> Have a look at the Deinterlacer and ChangeFrameRate pipes, and I'll >>> be around... you can also send me your code or prototype by email >>> directly if you want me to comment and help you out... >>> >>> >>>> >>>>> On Thursday, October 3, 2002, at 09:49 AM, Gaspard Petit wrote: >>>>> >>>>> Hello Doku, >>>>> >>>>> Unfortunately, the SDK has been changing too much with the last >>>>> versions of MediaPipe... 0.9 should have the final SDK and it will >>>>> be properly documented. >>>>> >>>>> Meanwhile, you can do two things. In the CVS, check out >>>>> >>>>> /MPipes/Video/Deinterlacer >>>>> >>>>> It's a very straightforward deinterlacer - it doesn't even check >>>>> if the image is really interlaced. It's something like 50 lines of >>>>> > code... >>>>> >>>>> Then, if you want to drop frames, you will want to take a look at >>>>> the framerate changer in >>>>> >>>>> /MPipes/Video/ChangeFrameRate >>>>> >>>>> This one is a tad bigger (about 250 lines) but it does settings >>>>> and configuration stuff... You will also notice that the >>>>> deinterlacer is a MPImageFilterPipe while the ChangeFrameRate is a >>>>> MPSISOPipe. You cannot drop or add frames with a >>>>> MPImageFilterPipe, but it's built on top of the MPSISOPipe... (I'm >>>>> talking about the base class of these pipes, see the .h) >>>>> >>>>> See if you can make sense out of that, and for the remaining >>>>> questions, I'll be more than happy to give you a hand =) >>>>> >>>>> Gaspard >>>>> >>>>> On Wednesday, October 2, 2002, at 08:49 PM, Viperman wrote: >>>>> >>>>>> I've already written the de-interlacer for VirtualDub, and I'm >>>>>> looking to port it to Media Pipe. >>>>>> >>>>>> It finds and removes the extra "frame" generated when PAL full >>>>>> frame is converted to NSTC interlaced. >>>>>> It takes 1 of ever 4 frames in the PAL sequence and splits it >>>>>> into A and B, then uses the B from the previous and the A from >>>>>> the next to make 2 new frames with the A B of the one being >>>>>> interlaced. This happens pretty often in Anime. >>>>>> >>>>>> Thanks for the help! I hope it's a framework in ProjectBuilder! >>>>>> >>>>>> Doku >>>>>> >>>>>> >>>>>> >>>>>> ------------------------------------------------------- >>>>>> This sf.net email is sponsored by:ThinkGeek >>>>>> Welcome to geek heaven. >>>>>> http://thinkgeek.com/sf >>>>>> _______________________________________________ >>>>>> MediaPipe-devel mailing list >>>>>> Med...@li... >>>>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>>>> >>>>> >>>>> >>>>> >>>>> ------------------------------------------------------- >>>>> This sf.net email is sponsored by:ThinkGeek >>>>> Welcome to geek heaven. >>>>> http://thinkgeek.com/sf >>>>> _______________________________________________ >>>>> MediaPipe-devel mailing list >>>>> Med...@li... >>>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>>> >>>> >>>> >>>> >>>> ------------------------------------------------------- >>>> This sf.net email is sponsored by:ThinkGeek >>>> Welcome to geek heaven. >>>> http://thinkgeek.com/sf >>>> _______________________________________________ >>>> MediaPipe-devel mailing list >>>> Med...@li... >>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>> >>> >>> >>> >>> ------------------------------------------------------- >>> This sf.net email is sponsored by:ThinkGeek >>> Welcome to geek heaven. 
>>> http://thinkgeek.com/sf >>> _______________________________________________ >>> MediaPipe-devel mailing list >>> Med...@li... >>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>> >> > |
From: Gaspard P. <mo...@us...> - 2002-10-04 17:05:25
sorry, you're right... there was a cvs login missing if I remember well... =) The 41 warnings are normal... If you want to checkout everything, you specify "." as the module as in cvs -d blahblah co . Good luck =) On Friday, October 4, 2002, at 12:40 PM, Viperman wrote: > > OK, I read up on the commandline CVS, and I just needed to log in > anonymously first to be able to download modules, but I can't download > everything, which is fine by me, I just needed the FrameRateChanger. > > I got it to compile, with 41 errors due to private destructors. I hope > this is ok. > > Thanks for all the help and please let me know if I did something > wrong.... > > > Doku > > > > On Friday, October 4, 2002, at 11:09 AM, Gaspard Petit wrote: > >> >> On Thursday, October 3, 2002, at 11:48 AM, Viperman wrote: >> >>> Ok, well, for some reason my CVS client inside Project Builder isn't >>> working, won't even let me enable it, but that's probably due to >>> user error. If there's some kind of setup that I need, please let me >>> know. I've never done CVS before. >> >> I believe that CVS in ProjectBuilder won't work if you use anonymous >> cvs... ProjectBuilder expects to be able to make modifications to the >> repository, which you cannot make with anonymous CVS. >> >> So for now, I'd say - just disable CVS in ProjectBuilder (it will ask >> you so) >> >> You can download the source code from the CVS using the command line: >> >> cvs -d ano...@cv...:/cvsroot/mediapipe/ co >> . >> >> That should checkout everything, including the SDK (it will be in >> Externs/MediaPipeSDK I think). You should be find the pipes I >> mentioned and compile them without problems... >> >>> >>> I downloaded the source manually from sf.net and it all looks pretty >>> straightforward. (thinking about modifying the framerate changer to >>> support just asking for standard framerates...) All it's missing is >>> the SDK, so I can't compile anything. Probably going to have to hack >>> at it for a while, but I've been fighting interlaced signal for >>> about 6 months now. At least the carbon code is making more sense to >>> me. I usually write CLI unix telnet servers, so all this graphics >>> stuff is new to me. However, I also wrote another little filter for >>> VDub you guys might want that embeds a watermark in the signal >>> (extremely basic). If I can get a BMP of whatever watermark they >>> want to use I'm pretty sure I can hash out a watermark/logo embedder >>> (and I probably will for my own purposes). The interface is the >>> hardest part for me to write, but I might as well learn Carbon/Cocca >>> while I have projects I can mathematically handle. >> >> The pipes are written in C++ and they use Carbon... but you need >> Carbon only if you write a GUI... The cool thing is - you don't >> *have* to write a GUI (see the getSettings/setSettings). The GUI is >> optional... >> >> Watermarking would be easy, but it will be even better in 0.9 where >> you will be able to say "I want two video input". You take the first, >> you use the second for watermarking... if the second is shorter, you >> loop it... if it's an image, it will be repeated on every frame... >> that way, you need not worry about loading the image and its >> format... More on this when 0.9 is actually available... Meanwhile, >> you could also hack something quickly if it's for your own >> purposes... >> >> Have a look at the Deinterlacer and ChangeFrameRate pipes, and I'll >> be around... 
you can also send me your code or prototype by email >> directly if you want me to comment and help you out... >> >> >>> >>>> On Thursday, October 3, 2002, at 09:49 AM, Gaspard Petit wrote: >>>> >>>> Hello Doku, >>>> >>>> Unfortunately, the SDK has been changing too much with the last >>>> versions of MediaPipe... 0.9 should have the final SDK and it will >>>> be properly documented. >>>> >>>> Meanwhile, you can do two things. In the CVS, check out >>>> >>>> /MPipes/Video/Deinterlacer >>>> >>>> It's a very straightforward deinterlacer - it doesn't even check if >>>> the image is really interlaced. It's something like 50 lines of > >>>> code... >>>> >>>> Then, if you want to drop frames, you will want to take a look at >>>> the framerate changer in >>>> >>>> /MPipes/Video/ChangeFrameRate >>>> >>>> This one is a tad bigger (about 250 lines) but it does settings and >>>> configuration stuff... You will also notice that the deinterlacer >>>> is a MPImageFilterPipe while the ChangeFrameRate is a MPSISOPipe. >>>> You cannot drop or add frames with a MPImageFilterPipe, but it's >>>> built on top of the MPSISOPipe... (I'm talking about the base class >>>> of these pipes, see the .h) >>>> >>>> See if you can make sense out of that, and for the remaining >>>> questions, I'll be more than happy to give you a hand =) >>>> >>>> Gaspard >>>> >>>> On Wednesday, October 2, 2002, at 08:49 PM, Viperman wrote: >>>> >>>>> I've already written the de-interlacer for VirtualDub, and I'm >>>>> looking to port it to Media Pipe. >>>>> >>>>> It finds and removes the extra "frame" generated when PAL full >>>>> frame is converted to NSTC interlaced. >>>>> It takes 1 of ever 4 frames in the PAL sequence and splits it into >>>>> A and B, then uses the B from the previous and the A from the next >>>>> to make 2 new frames with the A B of the one being interlaced. >>>>> This happens pretty often in Anime. >>>>> >>>>> Thanks for the help! I hope it's a framework in ProjectBuilder! >>>>> >>>>> Doku >>>>> >>>>> >>>>> >>>>> ------------------------------------------------------- >>>>> This sf.net email is sponsored by:ThinkGeek >>>>> Welcome to geek heaven. >>>>> http://thinkgeek.com/sf >>>>> _______________________________________________ >>>>> MediaPipe-devel mailing list >>>>> Med...@li... >>>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>>> >>>> >>>> >>>> >>>> ------------------------------------------------------- >>>> This sf.net email is sponsored by:ThinkGeek >>>> Welcome to geek heaven. >>>> http://thinkgeek.com/sf >>>> _______________________________________________ >>>> MediaPipe-devel mailing list >>>> Med...@li... >>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>> >>> >>> >>> >>> ------------------------------------------------------- >>> This sf.net email is sponsored by:ThinkGeek >>> Welcome to geek heaven. >>> http://thinkgeek.com/sf >>> _______________________________________________ >>> MediaPipe-devel mailing list >>> Med...@li... >>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>> >> >> >> >> ------------------------------------------------------- >> This sf.net email is sponsored by:ThinkGeek >> Welcome to geek heaven. >> http://thinkgeek.com/sf >> _______________________________________________ >> MediaPipe-devel mailing list >> Med...@li... >> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >> > |
From: Viperman <vip...@us...> - 2002-10-04 16:40:39
OK, I read up on the commandline CVS, and I just needed to log in anonymously first to be able to download modules, but I can't download everything, which is fine by me, I just needed the FrameRateChanger. I got it to compile, with 41 errors due to private destructors. I hope this is ok. Thanks for all the help and please let me know if I did something wrong.... Doku On Friday, October 4, 2002, at 11:09 AM, Gaspard Petit wrote: > > On Thursday, October 3, 2002, at 11:48 AM, Viperman wrote: > >> Ok, well, for some reason my CVS client inside Project Builder isn't >> working, won't even let me enable it, but that's probably due to user >> error. If there's some kind of setup that I need, please let me know. >> I've never done CVS before. > > I believe that CVS in ProjectBuilder won't work if you use anonymous > cvs... ProjectBuilder expects to be able to make modifications to the > repository, which you cannot make with anonymous CVS. > > So for now, I'd say - just disable CVS in ProjectBuilder (it will ask > you so) > > You can download the source code from the CVS using the command line: > > cvs -d ano...@cv...:/cvsroot/mediapipe/ co . > > That should checkout everything, including the SDK (it will be in > Externs/MediaPipeSDK I think). You should be find the pipes I > mentioned and compile them without problems... > >> >> I downloaded the source manually from sf.net and it all looks pretty >> straightforward. (thinking about modifying the framerate changer to >> support just asking for standard framerates...) All it's missing is >> the SDK, so I can't compile anything. Probably going to have to hack >> at it for a while, but I've been fighting interlaced signal for about >> 6 months now. At least the carbon code is making more sense to me. I >> usually write CLI unix telnet servers, so all this graphics stuff is >> new to me. However, I also wrote another little filter for VDub you >> guys might want that embeds a watermark in the signal (extremely >> basic). If I can get a BMP of whatever watermark they want to use I'm >> pretty sure I can hash out a watermark/logo embedder (and I probably >> will for my own purposes). The interface is the hardest part for me >> to write, but I might as well learn Carbon/Cocca while I have >> projects I can mathematically handle. > > The pipes are written in C++ and they use Carbon... but you need > Carbon only if you write a GUI... The cool thing is - you don't *have* > to write a GUI (see the getSettings/setSettings). The GUI is > optional... > > Watermarking would be easy, but it will be even better in 0.9 where > you will be able to say "I want two video input". You take the first, > you use the second for watermarking... if the second is shorter, you > loop it... if it's an image, it will be repeated on every frame... > that way, you need not worry about loading the image and its format... > More on this when 0.9 is actually available... Meanwhile, you could > also hack something quickly if it's for your own purposes... > > Have a look at the Deinterlacer and ChangeFrameRate pipes, and I'll be > around... you can also send me your code or prototype by email > directly if you want me to comment and help you out... > > >> >>> On Thursday, October 3, 2002, at 09:49 AM, Gaspard Petit wrote: >>> >>> Hello Doku, >>> >>> Unfortunately, the SDK has been changing too much with the last >>> versions of MediaPipe... 0.9 should have the final SDK and it will >>> be properly documented. >>> >>> Meanwhile, you can do two things. 
In the CVS, check out >>> >>> /MPipes/Video/Deinterlacer >>> >>> It's a very straightforward deinterlacer - it doesn't even check if >>> the image is really interlaced. It's something like 50 lines of > >>> code... >>> >>> Then, if you want to drop frames, you will want to take a look at >>> the framerate changer in >>> >>> /MPipes/Video/ChangeFrameRate >>> >>> This one is a tad bigger (about 250 lines) but it does settings and >>> configuration stuff... You will also notice that the deinterlacer is >>> a MPImageFilterPipe while the ChangeFrameRate is a MPSISOPipe. You >>> cannot drop or add frames with a MPImageFilterPipe, but it's built >>> on top of the MPSISOPipe... (I'm talking about the base class of >>> these pipes, see the .h) >>> >>> See if you can make sense out of that, and for the remaining >>> questions, I'll be more than happy to give you a hand =) >>> >>> Gaspard >>> >>> On Wednesday, October 2, 2002, at 08:49 PM, Viperman wrote: >>> >>>> I've already written the de-interlacer for VirtualDub, and I'm >>>> looking to port it to Media Pipe. >>>> >>>> It finds and removes the extra "frame" generated when PAL full >>>> frame is converted to NSTC interlaced. >>>> It takes 1 of ever 4 frames in the PAL sequence and splits it into >>>> A and B, then uses the B from the previous and the A from the next >>>> to make 2 new frames with the A B of the one being interlaced. This >>>> happens pretty often in Anime. >>>> >>>> Thanks for the help! I hope it's a framework in ProjectBuilder! >>>> >>>> Doku >>>> >>>> >>>> >>>> ------------------------------------------------------- >>>> This sf.net email is sponsored by:ThinkGeek >>>> Welcome to geek heaven. >>>> http://thinkgeek.com/sf >>>> _______________________________________________ >>>> MediaPipe-devel mailing list >>>> Med...@li... >>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>> >>> >>> >>> >>> ------------------------------------------------------- >>> This sf.net email is sponsored by:ThinkGeek >>> Welcome to geek heaven. >>> http://thinkgeek.com/sf >>> _______________________________________________ >>> MediaPipe-devel mailing list >>> Med...@li... >>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>> >> >> >> >> ------------------------------------------------------- >> This sf.net email is sponsored by:ThinkGeek >> Welcome to geek heaven. >> http://thinkgeek.com/sf >> _______________________________________________ >> MediaPipe-devel mailing list >> Med...@li... >> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >> > > > > ------------------------------------------------------- > This sf.net email is sponsored by:ThinkGeek > Welcome to geek heaven. > http://thinkgeek.com/sf > _______________________________________________ > MediaPipe-devel mailing list > Med...@li... > https://lists.sourceforge.net/lists/listinfo/mediapipe-devel > |
From: Viperman <vip...@us...> - 2002-10-04 16:27:34
:( It says that I need to specify a module??? cvs [checkout aborted]: must specify at least one module or directory Doku On Friday, October 4, 2002, at 11:09 AM, Gaspard Petit wrote: > > On Thursday, October 3, 2002, at 11:48 AM, Viperman wrote: > >> Ok, well, for some reason my CVS client inside Project Builder isn't >> working, won't even let me enable it, but that's probably due to user >> error. If there's some kind of setup that I need, please let me know. >> I've never done CVS before. > > I believe that CVS in ProjectBuilder won't work if you use anonymous > cvs... ProjectBuilder expects to be able to make modifications to the > repository, which you cannot make with anonymous CVS. > > So for now, I'd say - just disable CVS in ProjectBuilder (it will ask > you so) > > You can download the source code from the CVS using the command line: > > cvs -d ano...@cv...:/cvsroot/mediapipe/ co . > > That should checkout everything, including the SDK (it will be in > Externs/MediaPipeSDK I think). You should be find the pipes I > mentioned and compile them without problems... > >> >> I downloaded the source manually from sf.net and it all looks pretty >> straightforward. (thinking about modifying the framerate changer to >> support just asking for standard framerates...) All it's missing is >> the SDK, so I can't compile anything. Probably going to have to hack >> at it for a while, but I've been fighting interlaced signal for about >> 6 months now. At least the carbon code is making more sense to me. I >> usually write CLI unix telnet servers, so all this graphics stuff is >> new to me. However, I also wrote another little filter for VDub you >> guys might want that embeds a watermark in the signal (extremely >> basic). If I can get a BMP of whatever watermark they want to use I'm >> pretty sure I can hash out a watermark/logo embedder (and I probably >> will for my own purposes). The interface is the hardest part for me >> to write, but I might as well learn Carbon/Cocca while I have >> projects I can mathematically handle. > > The pipes are written in C++ and they use Carbon... but you need > Carbon only if you write a GUI... The cool thing is - you don't *have* > to write a GUI (see the getSettings/setSettings). The GUI is > optional... > > Watermarking would be easy, but it will be even better in 0.9 where > you will be able to say "I want two video input". You take the first, > you use the second for watermarking... if the second is shorter, you > loop it... if it's an image, it will be repeated on every frame... > that way, you need not worry about loading the image and its format... > More on this when 0.9 is actually available... Meanwhile, you could > also hack something quickly if it's for your own purposes... > > Have a look at the Deinterlacer and ChangeFrameRate pipes, and I'll be > around... you can also send me your code or prototype by email > directly if you want me to comment and help you out... > > >> >>> On Thursday, October 3, 2002, at 09:49 AM, Gaspard Petit wrote: >>> >>> Hello Doku, >>> >>> Unfortunately, the SDK has been changing too much with the last >>> versions of MediaPipe... 0.9 should have the final SDK and it will >>> be properly documented. >>> >>> Meanwhile, you can do two things. In the CVS, check out >>> >>> /MPipes/Video/Deinterlacer >>> >>> It's a very straightforward deinterlacer - it doesn't even check if >>> the image is really interlaced. It's something like 50 lines of > >>> code... 
>>> >>> Then, if you want to drop frames, you will want to take a look at >>> the framerate changer in >>> >>> /MPipes/Video/ChangeFrameRate >>> >>> This one is a tad bigger (about 250 lines) but it does settings and >>> configuration stuff... You will also notice that the deinterlacer is >>> a MPImageFilterPipe while the ChangeFrameRate is a MPSISOPipe. You >>> cannot drop or add frames with a MPImageFilterPipe, but it's built >>> on top of the MPSISOPipe... (I'm talking about the base class of >>> these pipes, see the .h) >>> >>> See if you can make sense out of that, and for the remaining >>> questions, I'll be more than happy to give you a hand =) >>> >>> Gaspard >>> >>> On Wednesday, October 2, 2002, at 08:49 PM, Viperman wrote: >>> >>>> I've already written the de-interlacer for VirtualDub, and I'm >>>> looking to port it to Media Pipe. >>>> >>>> It finds and removes the extra "frame" generated when PAL full >>>> frame is converted to NSTC interlaced. >>>> It takes 1 of ever 4 frames in the PAL sequence and splits it into >>>> A and B, then uses the B from the previous and the A from the next >>>> to make 2 new frames with the A B of the one being interlaced. This >>>> happens pretty often in Anime. >>>> >>>> Thanks for the help! I hope it's a framework in ProjectBuilder! >>>> >>>> Doku >>>> >>>> >>>> >>>> ------------------------------------------------------- >>>> This sf.net email is sponsored by:ThinkGeek >>>> Welcome to geek heaven. >>>> http://thinkgeek.com/sf >>>> _______________________________________________ >>>> MediaPipe-devel mailing list >>>> Med...@li... >>>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>>> >>> >>> >>> >>> ------------------------------------------------------- >>> This sf.net email is sponsored by:ThinkGeek >>> Welcome to geek heaven. >>> http://thinkgeek.com/sf >>> _______________________________________________ >>> MediaPipe-devel mailing list >>> Med...@li... >>> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >>> >> >> >> >> ------------------------------------------------------- >> This sf.net email is sponsored by:ThinkGeek >> Welcome to geek heaven. >> http://thinkgeek.com/sf >> _______________________________________________ >> MediaPipe-devel mailing list >> Med...@li... >> https://lists.sourceforge.net/lists/listinfo/mediapipe-devel >> > > > > ------------------------------------------------------- > This sf.net email is sponsored by:ThinkGeek > Welcome to geek heaven. > http://thinkgeek.com/sf > _______________________________________________ > MediaPipe-devel mailing list > Med...@li... > https://lists.sourceforge.net/lists/listinfo/mediapipe-devel > |
From: Gaspard P. <mo...@us...> - 2002-10-04 16:10:57
On Thursday, October 3, 2002, at 11:48 AM, Viperman wrote:

> Ok, well, for some reason my CVS client inside Project Builder isn't
> working, won't even let me enable it, but that's probably due to user
> error. If there's some kind of setup that I need, please let me know.
> I've never done CVS before.

I believe that CVS in ProjectBuilder won't work if you use anonymous
CVS... ProjectBuilder expects to be able to make modifications to the
repository, which you cannot do with anonymous CVS. So for now, I'd say
just disable CVS in ProjectBuilder (it will ask you to). You can download
the source code from the CVS using the command line:

cvs -d ano...@cv...:/cvsroot/mediapipe/ co .

That should check out everything, including the SDK (it will be in
Externs/MediaPipeSDK, I think). You should then find the pipes I mentioned
and be able to compile them without problems...

> I downloaded the source manually from sf.net and it all looks pretty
> straightforward. (Thinking about modifying the framerate changer to
> support just asking for standard framerates...) All it's missing is
> the SDK, so I can't compile anything. Probably going to have to hack
> at it for a while, but I've been fighting interlaced signals for about
> 6 months now. At least the Carbon code is making more sense to me. I
> usually write CLI unix telnet servers, so all this graphics stuff is
> new to me. However, I also wrote another little filter for VDub you
> guys might want that embeds a watermark in the signal (extremely
> basic). If I can get a BMP of whatever watermark they want to use, I'm
> pretty sure I can hash out a watermark/logo embedder (and I probably
> will for my own purposes). The interface is the hardest part for me to
> write, but I might as well learn Carbon/Cocoa while I have projects I
> can handle mathematically.

The pipes are written in C++ and they use Carbon... but you need Carbon
only if you write a GUI... The cool thing is, you don't *have* to write a
GUI (see getSettings/setSettings). The GUI is optional...

Watermarking would be easy, but it will be even better in 0.9, where you
will be able to say "I want two video inputs". You take the first, you use
the second for watermarking... if the second is shorter, you loop it... if
it's an image, it will be repeated on every frame... that way, you need
not worry about loading the image and its format... More on this when 0.9
is actually available...

Meanwhile, you could also hack something together quickly if it's for your
own purposes... Have a look at the Deinterlacer and ChangeFrameRate pipes,
and I'll be around... You can also send me your code or prototype by email
directly if you want me to comment and help you out...
|
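[Editor's note: Gaspard's "two video inputs" plan above - loop the second
input if it is shorter, repeat it on every frame if it is a single image -
comes down to how the second stream is indexed. The sketch below only
illustrates that indexing; the Frame type, the composite() mix, and the
function names are invented for the example and are not part of the actual
0.9 framework.]

    #include <cstddef>
    #include <vector>

    // Stand-in for whatever per-frame compositing the watermark pipe does;
    // each "frame" is just a brightness value so the example stays tiny.
    using Frame = double;

    Frame composite(const Frame& base, const Frame& overlay) {
        return 0.8 * base + 0.2 * overlay;   // arbitrary 80/20 mix
    }

    // Combine a main stream with a second, possibly shorter stream:
    // if the overlay runs out, loop it; a single image is a stream of
    // length 1, so it naturally repeats on every frame.
    std::vector<Frame> compositeStreams(const std::vector<Frame>& main,
                                        const std::vector<Frame>& overlay) {
        if (overlay.empty()) return main;    // nothing to composite
        std::vector<Frame> out;
        out.reserve(main.size());
        for (std::size_t i = 0; i < main.size(); ++i)
            out.push_back(composite(main[i], overlay[i % overlay.size()]));
        return out;
    }

With that indexing, a one-frame overlay (a logo image) is just the
degenerate case overlay[i % 1] == overlay[0].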
From: Viperman <vip...@us...> - 2002-10-03 15:48:30
|
Ok, well, for some reason my CVS client inside Project Builder isn't
working, won't even let me enable it, but that's probably due to user
error. If there's some kind of setup that I need, please let me know. I've
never done CVS before.

I downloaded the source manually from sf.net and it all looks pretty
straightforward. (Thinking about modifying the framerate changer to
support just asking for standard framerates...) All it's missing is the
SDK, so I can't compile anything. Probably going to have to hack at it for
a while, but I've been fighting interlaced signals for about 6 months now.
At least the Carbon code is making more sense to me. I usually write CLI
unix telnet servers, so all this graphics stuff is new to me.

However, I also wrote another little filter for VDub you guys might want
that embeds a watermark in the signal (extremely basic). If I can get a
BMP of whatever watermark they want to use, I'm pretty sure I can hash out
a watermark/logo embedder (and I probably will for my own purposes). The
interface is the hardest part for me to write, but I might as well learn
Carbon/Cocoa while I have projects I can handle mathematically.

Right now I usually make my own movies with video:
http://overpower.silverden.com/saito.html (requires QT6, and 1.5 Mbps DSL
to view without waiting on downloads). So embedding watermarks,
de-interlacing, and framerate conversion are essential for me to produce
higher-quality video. Anyway, I hope I can help advance these pipes as
much as I can, since I need 'em! ;)

Thanks for the time.

Doku
|
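[Editor's note: the watermark/logo embedder Doku describes is, at its
core, an alpha blend of a decoded logo over each frame. Below is a
minimal, SDK-independent sketch assuming packed 8-bit RGB frames and an
RGBA logo already decoded from the BMP; the struct layouts and names are
assumptions made for the example, not VirtualDub's or MediaPipe's actual
image types.]

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct Rgb8Image {            // assumed frame layout: packed RGB, 3 bytes/pixel
        int width = 0, height = 0;
        std::vector<std::uint8_t> pix;
    };

    struct Rgba8Image {           // assumed logo layout: packed RGBA, 4 bytes/pixel
        int width = 0, height = 0;
        std::vector<std::uint8_t> pix;
    };

    // Blend the logo onto the frame with its top-left corner at (x0, y0),
    // using the logo's alpha channel: out = alpha*logo + (1-alpha)*frame.
    void embedWatermark(Rgb8Image& frame, const Rgba8Image& logo, int x0, int y0) {
        for (int y = 0; y < logo.height; ++y) {
            int fy = y0 + y;
            if (fy < 0 || fy >= frame.height) continue;      // clip vertically
            for (int x = 0; x < logo.width; ++x) {
                int fx = x0 + x;
                if (fx < 0 || fx >= frame.width) continue;   // clip horizontally
                const std::uint8_t* src =
                    &logo.pix[(static_cast<std::size_t>(y) * logo.width + x) * 4];
                std::uint8_t* dst =
                    &frame.pix[(static_cast<std::size_t>(fy) * frame.width + fx) * 3];
                int a = src[3];                              // alpha, 0..255
                for (int c = 0; c < 3; ++c)
                    dst[c] = static_cast<std::uint8_t>(
                        (src[c] * a + dst[c] * (255 - a)) / 255);
            }
        }
    }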
From: Gaspard P. <mo...@us...> - 2002-10-03 14:51:03
|
Hello Doku,

Unfortunately, the SDK has been changing too much over the last versions
of MediaPipe... 0.9 should have the final SDK, and it will be properly
documented.

Meanwhile, you can do two things. In the CVS, check out

/MPipes/Video/Deinterlacer

It's a very straightforward deinterlacer - it doesn't even check whether
the image is really interlaced. It's something like 50 lines of code...

Then, if you want to drop frames, you will want to take a look at the
framerate changer in

/MPipes/Video/ChangeFrameRate

This one is a tad bigger (about 250 lines), but it does settings and
configuration stuff... You will also notice that the deinterlacer is a
MPImageFilterPipe while the ChangeFrameRate is a MPSISOPipe. You cannot
drop or add frames with a MPImageFilterPipe, but it's built on top of the
MPSISOPipe... (I'm talking about the base classes of these pipes; see the
.h files.)

See if you can make sense out of that, and for the remaining questions,
I'll be more than happy to give you a hand =)

Gaspard
|
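[Editor's note: the MPImageFilterPipe vs. MPSISOPipe split Gaspard
describes is easier to see in code. The sketch below is not the MediaPipe
SDK - every class and method name here is hypothetical - it only shows why
a filter that is handed one frame and must return one frame cannot change
the frame count, while a pipe that pulls frames from its upstream source
itself can drop or add them.]

    #include <cstdint>
    #include <memory>
    #include <vector>

    // A hypothetical frame type: just a pixel buffer plus its dimensions.
    struct Frame {
        int width = 0;
        int height = 0;
        std::vector<std::uint8_t> rgb;   // packed RGB, width * height * 3 bytes
    };

    // Per-image filter: the framework hands it one frame and expects exactly
    // one frame back, so it can change pixels but never the number of frames.
    class ImageFilter {
    public:
        virtual ~ImageFilter() = default;
        virtual Frame filterFrame(const Frame& in) = 0;
    };

    // A source of frames that a pipe can pull from.
    class FrameSource {
    public:
        virtual ~FrameSource() = default;
        virtual bool nextFrame(Frame& out) = 0;   // false when the stream ends
    };

    // Single-input/single-output pipe: it pulls frames from upstream itself,
    // so it is free to drop, duplicate, or merge them - which is what a
    // frame-rate changer (or a deinterlacer that recombines fields across
    // frames) needs to do.
    class SisoPipe : public FrameSource {
    public:
        explicit SisoPipe(std::shared_ptr<FrameSource> upstream)
            : upstream_(std::move(upstream)) {}
    protected:
        std::shared_ptr<FrameSource> upstream_;
    };

    // Example: halve the frame rate by dropping every other frame -
    // behaviour an ImageFilter could not express.
    class DropEveryOther : public SisoPipe {
    public:
        using SisoPipe::SisoPipe;
        bool nextFrame(Frame& out) override {
            if (!upstream_->nextFrame(out))
                return false;                 // upstream exhausted
            Frame skipped;
            upstream_->nextFrame(skipped);    // discard the following frame
            return true;
        }
    };

A frame-rate changer in this style decides per call how many upstream
frames to consume, which is exactly what a one-in/one-out image filter
cannot do.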
From: Viperman <vip...@us...> - 2002-10-03 00:49:31
|
I've already written the de-interlacer for VirtualDub, and I'm looking to
port it to Media Pipe.

It finds and removes the extra "frame" generated when PAL full-frame video
is converted to NTSC interlaced. It takes 1 of every 4 frames in the PAL
sequence and splits it into fields A and B, then uses the B from the
previous frame and the A from the next frame to make 2 new frames out of
the A and B of the one being interlaced. This happens pretty often in
anime.

Thanks for the help! I hope it's a framework in ProjectBuilder!

Doku
|
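[Editor's note: the field juggling described above rests on two small
operations - pulling one field (every other scan line) out of a frame, and
weaving two fields back into a full frame. A standalone sketch of those
two steps follows, assuming packed 8-bit RGB frames with an even height;
the buffer layout is an assumption for the example, not how VirtualDub or
MediaPipe actually store frames.]

    #include <algorithm>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct Frame {
        int width = 0, height = 0;              // height assumed even
        std::vector<std::uint8_t> rgb;          // packed RGB, width*height*3 bytes
    };

    // Extract one field: field 0 = even scan lines ("A"), field 1 = odd
    // lines ("B"). The result has half the height of the source frame.
    Frame extractField(const Frame& src, int field) {
        Frame f;
        f.width = src.width;
        f.height = src.height / 2;
        const std::size_t stride = static_cast<std::size_t>(src.width) * 3;
        f.rgb.resize(stride * f.height);
        for (int y = 0; y < f.height; ++y) {
            const std::uint8_t* srcLine = &src.rgb[(2 * y + field) * stride];
            std::copy(srcLine, srcLine + stride, &f.rgb[y * stride]);
        }
        return f;
    }

    // Weave two half-height fields back into one full frame: fieldA supplies
    // the even lines, fieldB the odd lines.
    Frame weaveFields(const Frame& fieldA, const Frame& fieldB) {
        Frame out;
        out.width = fieldA.width;
        out.height = fieldA.height * 2;
        const std::size_t stride = static_cast<std::size_t>(out.width) * 3;
        out.rgb.resize(stride * out.height);
        for (int y = 0; y < fieldA.height; ++y) {
            std::copy(&fieldA.rgb[y * stride], &fieldA.rgb[y * stride] + stride,
                      &out.rgb[(2 * y) * stride]);
            std::copy(&fieldB.rgb[y * stride], &fieldB.rgb[y * stride] + stride,
                      &out.rgb[(2 * y + 1) * stride]);
        }
        return out;
    }

"Use the B from the previous and the A from the next" is then just
weaveFields(extractField(next, 0), extractField(previous, 1)) applied to
the right pair of frames.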
From: Gaspard P. <mo...@us...> - 2002-07-23 04:50:02
|
Hello my quiet fellows...

This email is just to inform you that MediaPipe 0.8.7 is out (see
http://mediapipe.sourceforge.net/Forum/viewtopic.php?p=145#145 for the
public announcement).

As far as development is concerned, I will be going into a phase similar
to the 0.7->0.8 transition (which took almost 2 months). I expect to work
at least one month before 0.9.0 can be released. If you are planning on
writing pipes, please let me know, so that I can tell you what is going to
change in the SDK for the next release...

Gaspard
|
From: Gaspard P. <mo...@us...> - 2002-07-05 07:44:06
|
Hello,

I'll break the silence to announce that MediaPipe 0.8.5 is out. I will be
out of town for the next few days, so if there are problems, Makira is in
charge...

Gaspard
|