From: Burkhard P. <pl...@ip...> - 2004-09-14 12:09:39
Hi,

> We want to make an API that supports extra capabilities depending on
> the display target being used (plugable).

That sounds good.

> When an app notifies libvisual that it ONLY supports 16 bit and lower
> libvisual will transform 24 and 32 bits surfaces to 16 bit internally.

So it seems that apps still need to know a lot about the display
properties. For xine, mplayer and such, this would be no problem, but
what if someone just wants to write a quick and dirty visualisation app
(like gmerlin-visualizer)?

> Like SDL did kinda ?

Probably, but as I said, I don't really like (i.e. know) SDL. glut and
gtkglarea are also examples where apps can pass the required OpenGL
properties.

> I rather have the burden on libvisual, because that way we don't force
> anything. An example is video editing. There is a program LiVES
> (lives.sf.net). If I'm not wrong, that is used for VJ productions. Since
> ehm, this weekend it also supports libvisual, but it's not drawn
> in realtime because of all the video stuff. Basically we want to take
> as much as possible out of the plugins.

Keeping plugins simple is good. The problem is that for lemuria,
time == frame_count. Many effect mechanisms (e.g. the animated texture
on most foreground objects) rely on a constant and known framerate.

If you do completely synchronous rendering (i.e. render one picture
each 512 samples), you must render 44100/512 = 86 fps for lemuria.
I don't think that a "standard" consumer PC is able to do this when
full antialiasing is enabled :-)

In addition, OpenGL (i.e. glXSwapBuffers) can be synched to the vertical
frequency of the monitor, and for the high-speed scenes in lemuria this
is strongly recommended. Being in sync with both the music and the
monitor is impossible.

In addition, the animation speed would change if one switches to a
track with a different samplerate. For spectrum analyzers this doesn't
matter, but for lemuria it does!

I meditated a lot about this when writing lemuria. The only possible
solution was to render in a separate thread using the most recent audio
samples and limit the framerate using a software timer (to ca. 30 fps
in my case). On weak hardware the animation speed becomes slower, but
it's always independent from the audio signal.

Whether the plugins or libvisual handles this is a matter of taste, but
completely synchronous rendering is not possible.

On the other hand, when I use goom for making textures, I also use
synchronous rendering (pass audio samples and get a picture). For
goom2k4, I no longer use the xmms plugin but the goom library.

>> If libvisual handles all this, the best would be to start a separate
>> thread for each plugin. In addition, each plugin should have its
>> own Display* connection to completely avoid X11/multithread
>> related crashes.
>
> Keep in mind that we really don't want to just depend on X11, it should
> work everywhere: framebuffer, gstreamer, xine, whatever you'd like.

I know, I was only talking about the X11 case (the most interesting
for me).

> Well, since a libvisual pipeline is a synchronous stream I don't
> see the reason for threading.

I do (see above) :-)

>> One small thing: make DESTDIR=<some_dir> does not work.
>> It's needed when you want to make rpms.
>
> How do I solve that, Duilio is doing most of the auto* stuff :)

The problem is the install-exec-local target. There is an explicit
reference to $(prefix)/include/libvisual, which isn't changed by the
DESTDIR mechanism. Maybe using $(includedir) would help.

Cheers
Burkhard

--
_____________________________
Dr.-Ing. Burkhard Plaum
Institut fuer Plasmaforschung
Pfaffenwaldring 31, 70569 Stuttgart
Tel.: +49 711 685-2187  Fax.: -3102
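The software timer approach described above needs nothing more than
gettimeofday()/usleep(). A minimal sketch, assuming POSIX;
render_frame() and most_recent_samples() are placeholders, not lemuria
or libvisual functions:

  #include <sys/time.h>
  #include <unistd.h>

  /* Sleep so that successive calls are at least 1/fps seconds apart.
   * Call once per rendered frame. */
  static void limit_framerate(double fps)
  {
      static struct timeval last = { 0, 0 };
      struct timeval now;
      long frame_usec = (long)(1000000.0 / fps);
      long elapsed;

      gettimeofday(&now, NULL);
      if (last.tv_sec != 0) {
          elapsed = (now.tv_sec - last.tv_sec) * 1000000L
                  + (now.tv_usec - last.tv_usec);
          if (elapsed < frame_usec)
              usleep(frame_usec - elapsed);
      }
      gettimeofday(&last, NULL);  /* timestamp taken after the wait */
  }

  /* Render thread, limited to ca. 30 fps:
   *     for (;;) {
   *         render_frame(most_recent_samples());
   *         limit_framerate(30.0);
   *     }
   */

On weak hardware the usleep() is simply never reached and the loop runs
as fast as it can, which matches the behaviour described above (slower
animation, still independent from the audio signal).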
From: Burkhard P. <pl...@ip...> - 2004-09-15 11:35:36
> The application has final control, so I was thinking about the plugin.
> We can have a flag within the VisActorPlugin structure
> like .fps_preferred.

Ok, that's good.

> But what is the use of trying to draw frames if they don't get
> displayed anyway?

You're right, that should be avoided.

> In the case that you get enough audio samples for
> a high framerate, what is the use of asynchronous rendering?

On weaker CPUs, one visualization plugin might slow down the whole
process (in the worst case, also the audio playback). Imagine a player
which does something like:

  while(1)
  {
    get_next_samples();
    send_samples_to_soundcard();
    send_samples_to_libvisual();
  }

If send_samples_to_libvisual() renders and displays everything
synchronously, the soundcard buffer might underrun, which won't sound
that good. So at least at one point between the playback loop (which
drives the soundcard) and the visualization, things should be
decoupled.

Whether it's in the application, inside libvisual, or inside the
plugin doesn't matter.

Xmms says that the render_* functions shouldn't take long (for the
reason described above). This implies that plugins which are more CPU
intensive should be threaded.

>> If this doesn't interfere with syncing OpenGL to the monitor refresh
>> rate, it could be ok.
>
> It won't, because that is in GLX's hands. (Right?)

It's enabled via an environment variable (for the NVidia drivers at
least). The application cannot control the GL <-> monitor sync.

> We will surely make a good API that can let plugins request any format
> they want!

Ok.

> We're creating this through improving libvisual and introducing
> libvisual-display.

The libvisual-display seems to be crucial. Keep me updated when there
is something in CVS.

>> A workaround for this could be to copy the OpenGL framebuffer into
>> system memory and use the same display methods as for 2D plugins.
>> Maybe the PCs in 2010 can do this at 1280x1024 :-)
>
> Heheh, well, that is a dirty solution of course. It might even be
> good for that video editing stuff, but not for what we want! But you
> know that as well, lol.

Actually, lemuria does this for its 256x256 texture: Copy the
framebuffer into a texture and into memory, exchange some colors and
copy it back into a second texture. Because sometimes the foreground
and background have the same texture image, only the colors are
swapped. The funny thing is that most people think they are completely
different images :-)

--
_____________________________
Dr.-Ing. Burkhard Plaum
Institut fuer Plasmaforschung
Pfaffenwaldring 31, 70569 Stuttgart
Tel.: +49 711 685-2187  Fax.: -3102
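To make the decoupling concrete: a minimal sketch of the "at least one
point" mentioned above, assuming the simplest possible hand-off (the
newest 512 samples are copied into a mutex-protected buffer). The names
send_samples_to_visualization(), render_picture() and limit_framerate()
are placeholders, not libvisual API:

  #include <pthread.h>
  #include <string.h>

  #define FRAME_SAMPLES 512

  static short           latest[FRAME_SAMPLES];
  static pthread_mutex_t latest_lock = PTHREAD_MUTEX_INITIALIZER;

  /* Called from the playback loop: only copies the newest samples,
   * never waits for rendering, so the soundcard keeps getting fed. */
  void send_samples_to_visualization(const short *samples)
  {
      pthread_mutex_lock(&latest_lock);
      memcpy(latest, samples, sizeof(latest));
      pthread_mutex_unlock(&latest_lock);
  }

  /* Render thread: grabs whatever is newest and draws at its own pace. */
  void *render_thread(void *arg)
  {
      short local[FRAME_SAMPLES];

      (void)arg;
      for (;;) {
          pthread_mutex_lock(&latest_lock);
          memcpy(local, latest, sizeof(local));
          pthread_mutex_unlock(&latest_lock);

          /* render_picture(local);   placeholder for the actual plugin call */
          /* limit_framerate(30.0);   see the limiter sketch earlier */
      }
      return NULL;
  }

With this shape the playback loop can never be stalled by a slow frame;
at worst the visualization skips audio blocks it never saw.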
From: Dennis S. <sy...@yo...> - 2004-09-15 11:59:42
On Wed, 2004-09-15 at 13:38 +0200, Burkhard Plaum wrote:

> > In the case that you get enough audio samples for
> > a high framerate, what is the use of asynchronous rendering?
>
> On weaker CPUs, one visualization plugin might slow down the whole
> process (in the worst case, also the audio playback). Imagine a player
> which does something like:
>
>   while(1)
>   {
>     get_next_samples();
>     send_samples_to_soundcard();
>     send_samples_to_libvisual();
>   }
>
> If send_samples_to_libvisual() renders and displays everything
> synchronously, the soundcard buffer might underrun, which won't sound
> that good. So at least at one point between the playback loop (which
> drives the soundcard) and the visualization, things should be
> decoupled.
>
> Whether it's in the application, inside libvisual, or inside the
> plugin doesn't matter.

In this setup, you use an audio callback in the libvisual pipeline
which semi-automatically retrieves samples. Using that, plus helper
functions to run the complete libvisual pipeline in a thread, and this
issue is solved.

> > We're creating this through improving libvisual and introducing
> > libvisual-display.
>
> The libvisual-display seems to be crucial. Keep me updated when there
> is something in CVS.

Cool, watch the list, bits will be coming in the upcoming weeks.

> Actually, lemuria does this for its 256x256 texture: Copy the
> framebuffer into a texture and into memory, exchange some colors and
> copy it back into a second texture. Because sometimes the foreground
> and background have the same texture image, only the colors are
> swapped.

Well, ok, but here it's a small texture and for a special purpose.

> The funny thing is that most people think they are completely
> different images :-)

Hehehehe!

Cheers, Dennis
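The pull model Dennis describes (the pipeline asks for samples when it
needs them, instead of being pushed every block) could look roughly
like this; the type and struct names here are made up for illustration
and are not the actual libvisual interface:

  /* Illustration only -- hypothetical names, not the libvisual API. */
  typedef int (*audio_callback_t)(short *samples, int nsamples, void *user_data);

  struct vis_pipeline {
      audio_callback_t get_audio;   /* library pulls samples when it needs them */
      void            *user_data;
  };

  /* One iteration of the (possibly threaded) pipeline run loop. */
  static void pipeline_iteration(struct vis_pipeline *p)
  {
      short samples[512];

      if (p->get_audio(samples, 512, p->user_data) == 0) {
          /* analyze the samples, run the actor plugin, display the result */
      }
  }

This way the application only registers the callback; whether the
pipeline runs synchronously or in its own thread becomes the library's
business.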
From: Dennis S. <sy...@yo...> - 2004-09-14 19:11:47
On Tue, 2004-09-14 at 14:12 +0200, Burkhard Plaum wrote:

> > When an app notifies libvisual that it ONLY supports 16 bit and lower
> > libvisual will transform 24 and 32 bits surfaces to 16 bit internally.
>
> So it seems that apps still need to know a lot about the display
> properties. For xine, mplayer and such, this would be no problem, but
> what if someone just wants to write a quick and dirty visualisation
> app (like gmerlin-visualizer)?

Well, how can an app take the best depth without knowing it? You can
FORCE a plugin into a certain depth; it will get depth transformed
within libvisual. But when possible it's advised to at least try to
take the right depth. This reduces overhead.

> > Like SDL did kinda ?
>
> Probably, but as I said, I don't really like (i.e. know) SDL. glut and
> gtkglarea are also examples where apps can pass the required OpenGL
> properties.

Ok :) We will look at that for sure.

> > I rather have the burden on libvisual, because that way we don't force
> > anything. An example is video editing. There is a program LiVES
> > (lives.sf.net). If I'm not wrong, that is used for VJ productions. Since
> > ehm, this weekend it also supports libvisual, but it's not drawn
> > in realtime because of all the video stuff. Basically we want to take
> > as much as possible out of the plugins.
>
> Keeping plugins simple is good. The problem is that for lemuria,
> time == frame_count. Many effect mechanisms (e.g. the animated texture
> on most foreground objects) rely on a constant and known framerate.

What about a system that tells libvisual 'I'd REALLY like THIS
framerate' and then libvisual implements an easy to use framerate
limiter? I think this would be better because you don't FORCE it, which
gives more flexibility in some areas. Keep in mind that we don't just
write libvisual for playback apps. Some guys are doing VJ software
combined with realtime video software with it.

> If you do completely synchronous rendering (i.e. render one picture
> each 512 samples), you must render 44100/512 = 86 fps for lemuria.
> I don't think that a "standard" consumer PC is able to do this when
> full antialiasing is enabled :-)

Most plugins don't obtain this framerate; it's not a problem that a
picture isn't generated for every block of samples.

> In addition, OpenGL (i.e. glXSwapBuffers) can be synched to the vertical
> frequency of the monitor, and for the high-speed scenes in lemuria this
> is strongly recommended. Being in sync with both the music and the
> monitor is impossible.
>
> In addition, the animation speed would change if one switches to a
> track with a different samplerate. For spectrum analyzers this doesn't
> matter, but for lemuria it does!

As said, we can use an FPS limiter provided by libvisual, or wouldn't
that be sufficient?

> I meditated a lot about this when writing lemuria. The only possible
> solution was to render in a separate thread using the most recent audio
> samples and limit the framerate using a software timer (to ca. 30 fps
> in my case). On weak hardware the animation speed becomes slower, but
> it's always independent from the audio signal.

The audio core gets redone eventually, so don't rely too much on this
'512' thing.

> Whether the plugins or libvisual handles this is a matter of taste, but
> completely synchronous rendering is not possible.

What facilities are needed within libvisual to make it possible? (As
you can guess by now, I only use threads when it's absolutely needed,
no way around it.) Of course as a plugin writer it's your choice, but
I'd rather build some facilities into libvisual to work it out
threadlessly than use threads.

> The problem is the install-exec-local target. There is an explicit
> reference to $(prefix)/include/libvisual, which isn't changed by the
> DESTDIR mechanism. Maybe using $(includedir) would help.

Duilio, can you take a look at this?

Burkhard, thanks a lot for your input. I'm sure we can find the
technically right solution; discussion is good, it makes us think :)

Cheers, Dennis
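For reference, the depth transform Dennis mentions (forcing a 24/32 bit
plugin onto a 16 bit target) boils down to something like the
following; this is a generic sketch of a 0x00RRGGBB to RGB565
conversion, not the code libvisual actually uses:

  #include <stdint.h>

  /* Convert one row of 0x00RRGGBB pixels to 16 bit RGB565. */
  static void row_32_to_16(const uint32_t *src, uint16_t *dst, int width)
  {
      int i;

      for (i = 0; i < width; i++) {
          uint32_t p = src[i];
          uint16_t r = (p >> 19) & 0x1f;   /* top 5 bits of red   */
          uint16_t g = (p >> 10) & 0x3f;   /* top 6 bits of green */
          uint16_t b = (p >>  3) & 0x1f;   /* top 5 bits of blue  */

          dst[i] = (uint16_t)((r << 11) | (g << 5) | b);
      }
  }

Doing this for every pixel of every frame is exactly the overhead
Dennis refers to, which is why picking the right depth in the first
place is preferred.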
From: Burkhard P. <pl...@ip...> - 2004-09-15 10:06:11
> Well, how can an app take the best depth without knowing it?

Maybe because libvisual handles the Window/Visual/depth stuff?
It's nice if visualization plugins can be window system independent,
but IMHO it would be even better if applications also didn't need to
care about it. I didn't dig too deep into libvisual yet, so maybe I
sound stupid.

> What about a system that tells libvisual 'I'd REALLY like THIS
> framerate' and then libvisual implements an easy to use framerate
> limiter?

Who tells it, the application or the vis plugin?

> Some guys are doing VJ software
> combined with realtime video software with it.

Ok, then we need the framerate forced by the application.

> Most plugins don't obtain this framerate; it's not a problem that a
> picture isn't generated for every block of samples.

Exactly :-)
And this is where it gets asynchronous!

> As said, we can use an FPS limiter provided by libvisual, or wouldn't
> that be sufficient?

If this doesn't interfere with syncing OpenGL to the monitor refresh
rate, it could be ok.

> The audio core gets redone eventually, so don't rely too much on this
> '512' thing.

Oops, the 512 samples are hardcoded everywhere inside lemuria and many
other plugins. It's in the xmms API (and thus also in winamp, I would
guess). Changing visualization plugins to handle arbitrary audio frame
sizes would normally result in buffering samples until there are 512
of them.

> What facilities are needed within libvisual to make it possible? (As
> you can guess by now, I only use threads when it's absolutely needed,
> no way around it.) Of course as a plugin writer it's your choice, but
> I'd rather build some facilities into libvisual to work it out
> threadlessly than use threads.

Ok, IMHO there need to be at least 2 different layers: one which
handles the pcm -> picture stuff synchronously and threadless, and
another one which handles the window/displaying/framerate stuff.

For 2D plugins like goom, these two can be completely separated. You
can make a picture even without knowing into which window it will be
displayed later on. The problem is OpenGL, because rendering a picture
looks like this (speaking for GLX only):

  Window window;
  Display * display;
  GLXContext glxcontext;

  glXMakeCurrent(display, window, glxcontext);

  /* Do all the OpenGL calls for the frame */
  ...

  glXSwapBuffers(display, window);

And glXSwapBuffers can (and should) be synched to the monitor, as I
said.

A workaround for this could be to copy the OpenGL framebuffer into
system memory and use the same display methods as for 2D plugins.
Maybe the PCs in 2010 can do this at 1280x1024 :-)

I can't hack a lot for libvisual due to lack of time, but I would
really like to discuss the APIs for the display stuff when they get
written. I think libvisual is a good approach, but it will cost some
brain cycles to make it meet all requirements :-)

--
_____________________________
Dr.-Ing. Burkhard Plaum
Institut fuer Plasmaforschung
Pfaffenwaldring 31, 70569 Stuttgart
Tel.: +49 711 685-2187  Fax.: -3102
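The readback workaround mentioned above would essentially be a
glReadPixels() of the back buffer into client memory; a minimal sketch,
assuming a double-buffered GLX drawable and a caller-supplied RGB
buffer:

  #include <GL/gl.h>

  /* Copy the current back buffer into system memory as tightly packed RGB.
   * dst must hold width * height * 3 bytes; rows come back bottom-up. */
  static void read_back_framebuffer(int width, int height, unsigned char *dst)
  {
      glPixelStorei(GL_PACK_ALIGNMENT, 1);
      glReadBuffer(GL_BACK);
      glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, dst);
  }

The cost of this readback at full screen resolution is what makes it a
"2010" solution rather than a 2004 one.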
From: Dennis S. <sy...@yo...> - 2004-09-15 10:44:14
On Wed, 2004-09-15 at 12:05 +0200, Burkhard Plaum wrote:

> > Well, how can an app take the best depth without knowing it?
>
> Maybe because libvisual handles the Window/Visual/depth
> stuff?

Libvisual handles the visual/depth, NOT the windows. That is the whole
point. Because of this you can literally draw a visual anywhere. But
because we don't want the app to handle windowing itself, we are
planning to introduce libvisual-display, to solve this issue and have
more control!

> It's nice if visualization plugins can be window system independent,
> but IMHO it would be even better if applications also didn't need to
> care about it.
> I didn't dig too deep into libvisual yet, so maybe I sound stupid.

I agree with you; this is why we plan libvisual-display. But it needs
to be coded, and there aren't many of us :)

> > What about a system that tells libvisual 'I'd REALLY like THIS
> > framerate' and then libvisual implements an easy to use framerate
> > limiter?
>
> Who tells it, the application or the vis plugin?

The application has final control, so I was thinking about the plugin.
We can have a flag within the VisActorPlugin structure
like .fps_preferred.

> > Some guys are doing VJ software
> > combined with realtime video software with it.
>
> Ok, then we need the framerate forced by the application.

Optionally forced, yes, OPTIONALLY :) Also, this LiVES stuff uses only
framebuffer plugins, because it mixes with video clips.

> > Most plugins don't obtain this framerate; it's not a problem that a
> > picture isn't generated for every block of samples.
>
> Exactly :-)
> And this is where it gets asynchronous!

But what is the use of trying to draw frames if they don't get
displayed anyway? In the case that you get enough audio samples for a
high framerate, what is the use of asynchronous rendering? And if you
can't obtain the framerate, how does asynchronous rendering help?

> > As said, we can use an FPS limiter provided by libvisual, or wouldn't
> > that be sufficient?
>
> If this doesn't interfere with syncing OpenGL to the monitor refresh
> rate, it could be ok.

It won't, because that is in GLX's hands. (Right?)

> > The audio core gets redone eventually, so don't rely too much on this
> > '512' thing.
>
> Oops, the 512 samples are hardcoded everywhere inside lemuria and many
> other plugins. It's in the xmms API (and thus also in winamp, I would
> guess).

We will surely make a good API that can let plugins request any format
they want!

> > What facilities are needed within libvisual to make it possible? (As
> > you can guess by now, I only use threads when it's absolutely needed,
> > no way around it.) Of course as a plugin writer it's your choice, but
> > I'd rather build some facilities into libvisual to work it out
> > threadlessly than use threads.
>
> Ok, IMHO there need to be at least 2 different layers: one which
> handles the pcm -> picture stuff synchronously and threadless, and
> another one which handles the window/displaying/framerate stuff.

We're creating this through improving libvisual and introducing
libvisual-display.

> For 2D plugins like goom, these two can be completely separated. You
> can make a picture even without knowing into which window it will be
> displayed later on.

Yep, for OpenGL this is a different case of course.

> The problem is OpenGL, because rendering a picture looks like this
> (speaking for GLX only):
> .....
> And glXSwapBuffers can (and should) be synched to the monitor, as I
> said.

Yep.

> A workaround for this could be to copy the OpenGL framebuffer into
> system memory and use the same display methods as for 2D plugins.
> Maybe the PCs in 2010 can do this at 1280x1024 :-)

Heheh, well, that is a dirty solution of course. It might even be good
for that video editing stuff, but not for what we want! But you know
that as well, lol.

> I can't hack a lot for libvisual due to lack of time, but I would
> really like to discuss the APIs for the display stuff when they get
> written.

That is ok; discussing APIs is contribution. Two know more than one,
so to say.

> I think libvisual is a good approach, but it will cost some brain
> cycles to make it meet all requirements :-)

That is why it's good to have many people on the list, so we can
distribute those brain cycles *grin*. Thanks for all your comments,
it's definitely valuable!

Cheers, Dennis
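The .fps_preferred idea from this mail could fit together with an
application override roughly like this; the structure layout is
hypothetical and not the actual VisActorPlugin definition:

  /* Hypothetical shapes -- illustrates the proposal, not real libvisual code. */
  struct actor_plugin_info {
      const char *name;
      float       fps_preferred;   /* 0.0 means "no preference" */
  };

  /* Host side: honour the plugin's preference unless the app forces a rate. */
  static float effective_fps(const struct actor_plugin_info *info,
                             float forced_fps)
  {
      if (forced_fps > 0.0f)
          return forced_fps;            /* e.g. a VJ app mixing with video */
      if (info->fps_preferred > 0.0f)
          return info->fps_preferred;   /* e.g. lemuria asking for ~30 fps */
      return 50.0f;                     /* arbitrary fallback */
  }

The result would then drive a limiter like the one sketched earlier in
the thread.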