From: Dennis S. <sy...@yo...> - 2004-09-14 19:11:47
On Tue, 2004-09-14 at 14:12 +0200, Burkhard Plaum wrote:

> > When an app notifies libvisual that it ONLY supports 16 bit and lower,
> > libvisual will transform 24 and 32 bit surfaces to 16 bit internally.
>
> So it seems that apps still need to know a lot about the
> display properties. For xine, mplayer and such, this would be
> no problem, but what if someone just wants to write a quick and dirty
> visualisation app (like gmerlin-visualizer)?

Well, how can an app pick the best depth without knowing about it? You can
FORCE a plugin into a certain depth, and it will get depth transformed within
libvisual. But when possible it's advised to at least try to pick the right
depth, since that reduces overhead. (There is a rough sketch of what I mean
further down.)

> > Like SDL did, kinda?
>
> Probably, but as I said, I don't really like (i.e. know) SDL.
> glut and gtkglarea are also examples where apps can pass the
> required OpenGL properties.

Ok :) We will look at that for sure.

> > I'd rather have the burden on libvisual, because that way we don't force
> > anything. An example is video editing. There is a program LiVES
> > (lives.sf.net); if I'm not wrong it is used for VJ productions. Since,
> > ehm, this weekend it also supports libvisual, but it doesn't draw
> > in realtime because of all the video stuff. Basically we want to take
> > as much as possible out of the plugins.
>
> Keeping plugins simple is good.
> The problem is that for lemuria, time == frame_count.
> Many effect mechanisms (e.g. the animated texture on most
> foreground objects) rely on a constant and known framerate.

What about a system where the plugin tells libvisual "I'd REALLY like THIS
framerate", and libvisual provides an easy to use framerate limiter? (There
is a sketch of that further down as well.) I think this would be better
because you don't FORCE it, which keeps things more flexible in some areas.
Keep in mind that we don't just write libvisual for playback apps: some guys
are building VJ software combined with realtime video software on top of it.

> If you do completely synchronous rendering (i.e. render one picture
> each 512 samples), you must render 44100/512 = 86 fps for lemuria.
> I don't think that a "standard" consumer PC is able to do this
> when full antialiasing is enabled :-)

Most plugins don't reach that framerate anyway; it's not a problem if a
picture isn't generated for every block of samples.

> In addition, OpenGL (i.e. GLXSwapBuffers) can be synched to
> the vertical frequency of the monitor, and for the high-speed
> scenes in lemuria, this is strongly recommended.
> Being in sync with both the music and the monitor is impossible.
>
> In addition, the animation speed would change if one switches to
> a track with a different samplerate. For spectrum analyzers this
> doesn't matter, but for lemuria it does!

As said, we could use an FPS limiter provided by libvisual, or wouldn't that
be sufficient?

> I meditated a lot about this when writing lemuria. The only possible
> solution was to render in a separate thread using the
> most recent audio samples and limit the framerate using a
> software timer (to ca. 30 fps in my case). On weak hardware,
> the animation speed becomes slower, but it's always independent
> from the audio signal.

The audio core gets redone eventually, so don't rely too much on this '512'
thing.

> Whether the plugins or libvisual handles this is a matter of taste,
> but completely synchronous rendering is not possible.

What facilities are needed within libvisual to make it possible? (As you can
guess by now, I only use threads when it's absolutely needed, no way around
it.) Of course, as a plugin writer it's your choice, but I'd rather add some
facilities to libvisual so this can be worked out threadlessly than rely on
threads.
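To make the depth question above a bit more concrete, this is roughly how I
picture the app side. Treat it as pseudo-code: the visual_* names and the
enum are only my guess at how the calls could look, not a promise about the
actual API.

  #include <libvisual/libvisual.h>

  /* Hypothetical app-side depth selection; the calls below are
   * illustrative only and may not match the real API. */
  static VisVideoDepth
  select_depth (VisActor *actor, int display_max_bpp)
  {
          /* Ask the plugin which depths it can render natively. */
          int supported = visual_actor_get_supported_depth (actor);

          if (display_max_bpp >= 24) {
                  /* Display can do truecolor: take the highest non-GL
                   * depth the plugin offers, so nothing gets converted. */
                  return visual_video_depth_get_highest_nogl (supported);
          }

          /* App only supports 16 bit and lower: force 16 bit and let
           * libvisual transform 24/32 bit surfaces internally.
           * This works, but costs extra overhead per frame. */
          return VISUAL_VIDEO_DEPTH_16BIT;
  }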
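And this is the kind of framerate limiter I have in mind. Nothing of this
exists in libvisual yet; it's just a plain software timer to show the idea:
the plugin asks for a framerate and the library sleeps away whatever is left
of each frame slot.

  #include <sys/time.h>
  #include <unistd.h>

  /* Sketch of a software framerate limiter: call once per rendered
   * frame with the framerate the plugin asked for (e.g. 30). */
  static void
  limit_framerate (int wanted_fps)
  {
          static struct timeval last = { 0, 0 };
          struct timeval now;
          long frame_usec = 1000000L / wanted_fps;

          gettimeofday (&now, NULL);

          if (last.tv_sec != 0) {
                  long elapsed = (now.tv_sec - last.tv_sec) * 1000000L
                               + (now.tv_usec - last.tv_usec);

                  /* Sleep away the rest of this frame's time slot. */
                  if (elapsed < frame_usec)
                          usleep (frame_usec - elapsed);
          }

          gettimeofday (&last, NULL);
  }

On weak hardware the animation simply gets slower, like in your lemuria
approach, but the plugin itself never has to touch threads.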
> The problem is the install-exec-local target. There is an
> explicit reference to $(prefix)/include/libvisual, which isn't
> changed by the DESTDIR mechanism. Maybe using $(includedir) would
> help.
>
> Burkhard

Duilio, can you take a look at this? (A guess at the fix is in the PS below.)

Thanks a lot for your input. I'm sure we can find the technically right
solution; discussion is good, it makes us think :)

Cheers,
Dennis
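PS: about the DESTDIR problem, I imagine something along these lines would do
it (untested, and the header path/glob is just a placeholder for whatever the
real Makefile.am installs there):

  install-exec-local:
          $(mkinstalldirs) $(DESTDIR)$(includedir)/libvisual
          $(INSTALL_DATA) $(top_srcdir)/libvisual/*.h \
                  $(DESTDIR)$(includedir)/libvisual

i.e. prepend $(DESTDIR) and use $(includedir) instead of hardcoding
$(prefix)/include/libvisual, as you suggested.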