From: Robert O. <ro...@op...> - 2005-02-01 13:47:19
Hi Matthias,

> Robert,
>
> why not download frame data to textures directly? IMHO that would be a
> more efficient approach. If you want to be at least soft-realtime you
> need all the performance you can get.

What exactly do you mean here when you say "download frame data to
textures directly"? What frame data are we talking about here? Who
creates the frame data? Note that the front and back buffers OpenGL is
managing are of a 3D scene, not of a 2D video feed.

> > I've looked at the existing opengl plugin, but this way over
> > complicated for what I'm expecting to require - the OSG has lots of
> > OpenGL functionality all
>
> Over complicated with respect to what?

The OpenGL plugin does a lot of OpenGL setup and calls that are
completely unnecessary for an equivalent OSG plugin, as the OSG can do
the vast majority of this work itself.

> Please note that xine is a multithreaded application, and the video
> plugin is called from several different threads - even if the plugin is
> just called for rendering (not setup, etc.) it can be from different
> threads (main loop vs. expose events).

The OSG is multi-threaded too, so I'm familiar with the issues of
managing multi-threaded apps.

> This makes for most of the more complicated stuff in the plugin. I had
> to create my own render thread and use IPC for coordination. The
> rendering routines are almost trivial.
>
> You have to make sure that the same thread does OSG rendering and
> texture upload.

The OSG will have its own threads; for the xine plugin I'll just need a
thread which manages a set of 2D RGB images that the OSG rendering
thread can read on demand. I expect to at least double-buffer the 2D RGB
images, so that one can be written to while the OSG's thread safely
reads from the other without tearing.
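The double-buffering scheme described above could be sketched roughly as
follows. This is a minimal illustration, not the actual plugin code, and
the class and member names are hypothetical: the decode thread fills the
back buffer and publishes it by swapping indices under a mutex, while the
render thread copies out whichever buffer is currently the front one.

```cpp
#include <cstddef>
#include <cstdint>
#include <mutex>
#include <vector>

// Hypothetical double-buffered RGB image holder. The writer (xine
// decode thread) and reader (OSG render thread) never touch the same
// buffer at the same time: the front index only changes under the lock.
class DoubleBufferedImage {
public:
    explicit DoubleBufferedImage(std::size_t bytes)
        : buffers_{std::vector<std::uint8_t>(bytes),
                   std::vector<std::uint8_t>(bytes)},
          front_(0) {}

    // Decode thread: fill the back buffer, then publish it by swapping
    // front and back under the lock. Only this thread modifies front_.
    void write(const std::vector<std::uint8_t>& frame) {
        const int back = 1 - front_;
        buffers_[back] = frame;  // copy new frame into the back buffer
        std::lock_guard<std::mutex> lock(mutex_);
        front_ = back;           // publish: the reader now sees this frame
    }

    // Render thread: copy out the current front buffer. Holding the
    // lock for the copy keeps the writer from swapping mid-read.
    std::vector<std::uint8_t> read() const {
        std::lock_guard<std::mutex> lock(mutex_);
        return buffers_[front_];
    }

private:
    std::vector<std::uint8_t> buffers_[2];
    int front_;
    mutable std::mutex mutex_;
};
```

A triple-buffered variant would let the writer run unthrottled without
ever waiting on a slow reader, at the cost of one more frame of memory.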
I've implemented something similar when integrating with libmpeg3, but
libmpeg3 is quite a bit simpler in terms of usage, since you explicitly
call it to get the image for each frame, and it doesn't have the plugin
architecture that xine-lib has.

> > ready to be used, so I'm curious about an other existing plugin that
> > would better fit the profile of usage that I'll need. Can any one
> > recommend a bare bones video-out plugin that might serve as a good
> > example from which to base my own plugin on.
>
> Depends on what you want to do. If you want yuv->rgb conversion and
> image scaling I suggest using the xshm module. If you want to get raw
> yuv or yuy2 data, use the xv plugin. If you want conversion but no
> scaling I would (how come ;) suggest the OpenGL plugin. I basically
> merged parts from the xshm and the xv plugin to get a non-scaling
> yuv-converting plugin.

I am curious about using YUV OpenGL texture formats under OSX, and
fragment programs on other platforms, for doing the colour space
conversion, so I may well be able to take unscaled YUV data. However,
for a first pass I'll stick with having the yuv->rgb conversion done by
xine-lib, and no scaling, as scaling typically isn't needed when doing
OpenGL since the graphics pipeline scales for you anyway.

In some ways the OpenGL plugin is a good match, but I believe it is way
over complicated relative to my own needs. Possibly the biggest
challenge is wiring up an OSG plugin to xine-lib in such a way that
xine-lib just uses locally defined callbacks, rather than needing yet
another video-out plugin that requires some fancy footwork to keep it
wired up with the original OSG plugin that was loaded to support video
reading.

I've been off on another task so far this week (volume rendering of MRI
scans), but tomorrow I'll be back on the case of getting an xine-lib
based OSG plugin working; perhaps more questions will surface then.
Hopefully by the end of the week I'll have some fun screenshots to
share.

Robert.
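For what it's worth, the colour-space conversion a fragment program would
do is just per-pixel arithmetic. A scalar C++ sketch of the standard
full-range BT.601 YCbCr -> RGB mapping (the same maths, give or take
rounding, that a software converter or a fragment program would apply;
function names here are illustrative, not from xine-lib):

```cpp
#include <algorithm>

// One RGB pixel; components in [0, 255].
struct RGB { int r, g, b; };

// Clamp a value to [0, 255] with rounding to nearest.
static int clamp255(double v) {
    return static_cast<int>(std::max(0.0, std::min(255.0, v + 0.5)));
}

// Full-range BT.601 YCbCr -> RGB conversion for one pixel. A fragment
// program would run the equivalent arithmetic once per fragment.
RGB yuv_to_rgb(int y, int cb, int cr) {
    const double u = cb - 128.0;  // centre chroma around zero
    const double v = cr - 128.0;
    return RGB{
        clamp255(y + 1.402000 * v),
        clamp255(y - 0.344136 * u - 0.714136 * v),
        clamp255(y + 1.772000 * u),
    };
}
```

On hardware without fragment programs, Apple's YUV texture extension
mentioned above effectively performs this same mapping during texturing.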