From: Marco L. <ml...@cs...> - 2004-08-25 12:49:21
|
Hi there, from the feedback we got at the KDE conference I think one important thing for KDE multimedia developers (as well as for other multimedia developers) would be to have some comparison between MAS, GStreamer, and NMM.

We would like to start this comparison with a look at two things: (1) the programming models (i.e. what code you have to write to achieve certain things) and (2) the APIs provided by the different frameworks (in correspondence with Matthias Ettrich's statement "the API is to the programmer what a GUI is to the end user"). Therefore, we suggest that every project provide the source code for a number of helloworld programs.

The first program (helloworld I) should demonstrate how the following features can be accessed:

(1) Allow the playback of an encoded audio file (e.g. MP3). This will result in similar setups: a component for reading data from a file, connected to a component for decoding, connected to a component for audio output. (Together, this is called a "pipeline" or "flow graph".)
(2) Set the filename of the file to be read.
(3) Manually request/set up this functionality, i.e. no automatic setup of flow graphs.
(4) Include some error handling.

In a second step, we would like to extend the helloworld program with the following feature (helloworld II):

(1) Add a listener that gets notified when the currently playing file has ended, i.e. this listener is to be triggered after the last byte was played by the audio device.

In a final step, we would like to extend the helloworld program (helloworld I) to allow for distributed playback (helloworld III):

(1) The component for reading data from a file should be located on the local host. The components for decoding and playing the audio data should be located on a remote host.

Notice that this third example should also demonstrate how easy (or painful) it is to develop networked multimedia applications using the particular framework. 
We hope that this will finally show that developing distributed multimedia applications means more than "well, simply write a component for streaming data and put that into your pipeline". We think that these three examples provide typical features needed for developing multimedia applications. Furthermore, these features are simple enough to be provided for all three frameworks. Therefore, we hope that the developers of MAS and GStreamer are willing to participate in this comparison.

For NMM, see the following links:

helloworld I: http://graphics.cs.uni-sb.de/NMM/current/Docs/helloworld/x62.html
helloworld II: http://graphics.cs.uni-sb.de/NMM/current/Docs/helloworld/x122.html
helloworld III: http://graphics.cs.uni-sb.de/NMM/current/Docs/helloworld/x145.html

We also set up a page at http://graphics.cs.uni-sb.de/NMM/comparison.html that should link to the examples provided for MAS and GStreamer.

Have fun, Marco.

P.S. If you think this email should be forwarded to other mailing lists, feel free to do so, but please cc me so that I am informed about further discussions. |
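For readers without the NMM docs at hand, the manual setup Marco asks for in helloworld I can be sketched in a few lines of C++. This is a deliberately toy model: every name below (Node, build_playback_graph, the element-type strings) is invented for illustration and is not MAS, GStreamer, or NMM API.

```cpp
#include <cassert>
#include <stdexcept>
#include <string>
#include <utility>

// Invented stand-in for a flow-graph element; real frameworks have far
// richer node/element types than this.
struct Node {
    std::string type;
    std::string filename;          // only meaningful for source nodes
    Node* downstream = nullptr;
    explicit Node(std::string t) : type(std::move(t)) {}
};

// Manually build the helloworld I graph: file source -> decoder -> sink.
// Rejecting an empty filename stands in for feature (4), error handling.
Node* build_playback_graph(const std::string& filename) {
    if (filename.empty())
        throw std::runtime_error("no filename given");
    Node* src  = new Node("file_source");
    Node* dec  = new Node("mp3_decoder");
    Node* sink = new Node("audio_sink");
    src->filename = filename;      // feature (2): set the filename
    src->downstream = dec;         // feature (3): manual, explicit linking
    dec->downstream = sink;
    return src;
}
```

A real framework would of course also negotiate formats between the linked components and report linking failures; the point here is only the shape of the manual setup.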
From: Thomas V. S. <th...@ap...> - 2004-08-25 13:58:49
|
Hi Marco,

> (1) Allow the playback of an encoded audio file (e.g. MP3). This will
> result in similar setups: a component for reading data from a
> file connected to a component for decoding connected to a component
> for audio output. (Together, this is called "pipeline" or "flow
> graph").
> (2) Set the filename of the file to be read.
> (3) Manually request/setup this functionality, i.e. no automatic setup
> of flow graphs.
> (4) Include some error handling.

I'm curious about (3) - why should it not be done automatically ? So you're saying you just want an application that can only play mp3s ? Personally I think a much better test would be to have a helloworld that can take a media file and just play it, whatever type it is. That's what users care about, anyway.

> In a second step, we would like to extend the helloworld program with
> the following feature (helloworld II):
>
> (1) Add a listener that gets notified if the currently playing file
> has ended, i.e. this listener is to be triggered after the last byte
> was played by the audio device.

What sort of thing is your listener ? An in-program function callback ? Another process ? Something else ?

> In a final step, we would like to extend the helloworld program
> (helloworld I) to allow for distributed playback (helloworld III):
>
> (1) The component for reading data from a file should be located on the
> local host. The component for decoding, and playing the audio data should
> be located on a remote host.
>
> Notice that this third example should also demonstrate how easy (or
> painful) it is to develop networked multimedia applications using the
> particular framework. We hope that this will finally show that
> developing distributed multimedia applications means more than "well,
> simply write a component for streaming data and put that into your
> pipeline".

Not sure why the third is important. 
While it's important for multimedia frameworks to be able to do things like this, I don't see the value of this for a desktop environment. Can you provide a use case where this makes sense ?

Also, I don't think it's smart to do this for audio only. We all know audio is the easiest to get right anyway, and audio presents a lot less challenge to frameworks.

I'm sure I could come up with other things that are important to be tested, I'll think about it some more.

Thomas

Dave/Dina : future TV today ! - http://www.davedina.org/
<-*- thomas (dot) apestaart (dot) org -*->
If you want love we'll make it
<-*- thomas (at) apestaart (dot) org -*->
URGent, best radio on the net - 24/7 ! - http://urgent.fm/ |
From: Steve L. <ste...@fr...> - 2004-08-25 14:18:39
|
Thomas Vander Stichele wrote:

> Hi Marco,
>
>> (1) Allow the playback of an encoded audio file (e.g. MP3). This will
>> result in similar setups: a component for reading data from a
>> file connected to a component for decoding connected to a component
>> for audio output. (Together, this is called "pipeline" or "flow
>> graph").
>> (2) Set the filename of the file to be read.
>> (3) Manually request/setup this functionality, i.e. no automatic setup
>> of flow graphs.
>> (4) Include some error handling.
>
> I'm curious about (3) - why should it not be done automatically ? So
> you're saying you just want an application that can only play mp3's ?
> Personally I think a much better test would be to have a helloworld that
> can take a media file and just play it, whatever type it is. That's
> what users care about, anyway.

Well, I think there should be 2 examples: one for audio-only and one for "multimedia" in general. That's usually what users tend to use.

Also, you should take meta-information into consideration, like the title/artist of a song/movie, the instant/average bitrate of streams, etc. Any user would like this kind of information. |
From: Matthias K. <kr...@kd...> - 2004-08-25 14:41:24
|
On Wednesday 25 August 2004 15:58, Thomas Vander Stichele wrote:

> Hi Marco,
>
> > (1) Allow the playback of an encoded audio file (e.g. MP3). This will
> > result in similar setups: a component for reading data from a
> > file connected to a component for decoding connected to a component
> > for audio output. (Together, this is called "pipeline" or "flow
> > graph").
> > (2) Set the filename of the file to be read.
> > (3) Manually request/setup this functionality, i.e. no automatic setup
> > of flow graphs.
> > (4) Include some error handling.
>
> I'm curious about (3) - why should it not be done automatically ?

Both should be possible. In case you don't want it to be done behind your back, when you want to have a special setup...

> So you're saying you just want an application that can only play mp3's ?
> Personally I think a much better test would be to have a helloworld that
> can take a media file and just play it, whatever type it is. That's
> what users care about, anyway.

This is not about writing an especially useful app but about looking at how the API "feels". While I find it interesting to see how to set up the framework to play an mp3 file, it's of course also interesting how much the framework can do on its own.

But... this is not really about comparing features, it's more about comparing APIs.

> > In a second step, we would like to extend the helloworld program with
> > the following feature (helloworld II):
> >
> > (1) Add a listener that gets notified if the currently playing file
> > has ended, i.e. this listener is to be triggered after the last byte
> > was played by the audio device.
>
> What sort of thing is your listener ? An in-program function callback ?
> Another process ? Something else ?

Any kind of listener. If in C you do it with a callback, just show the code for how to do it. In Qt/KDE you'd just want to connect a slot to a signal. 
> > In a final step, we would like to extend the helloworld program
> > (helloworld I) to allow for distributed playback (helloworld III):
> >
> > (1) The component for reading data from a file should be located on the
> > local host. The component for decoding, and playing the audio data should
> > be located on a remote host.
> >
> > Notice that this third example should also demonstrate how easy (or
> > painful) it is to develop networked multimedia applications using the
> > particular framework. We hope that this will finally show that
> > developing distributed multimedia applications means more than "well,
> > simply write a component for streaming data and put that into your
> > pipeline".
>
> Not sure why the third is important. While it's important for
> multimedia frameworks to be able to do things like this, I don't see the
> value of this for a desktop environment. Can you provide a use case
> where this makes sense ?

Thin clients, conferencing, cool stuff like shown in the NMM talk at aKademy ;-)

> Also, I don't think it's smart to do this for audio only. We all know
> audio is the easiest to get right anyway, and audio presents a lot less
> challenge to frameworks.

Uh, I don't agree that audio is the easier part. Video is easier. Synchronization isn't easy, though.

> I'm sure I could come up with other things that are important to be
> tested, I'll think about it some more.

Again, it's not about a testsuite or features but about the API.

--
C'ya
Matthias

________________________________________________________
Matthias Kretz (Germany) <>< http://Vir.homeip.net/
Mat...@gm..., kr...@kd..., Mat...@ur... |
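The helloworld II requirement under discussion here — a listener fired only after the last byte has been played — covers both styles Matthias mentions (a C callback or a Qt signal/slot). A minimal C++ sketch, with an invented Pipeline class standing in for any of the three frameworks:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Invented stand-in for a playback pipeline; the class and method names
// match no real MAS/GStreamer/NMM API.
class Pipeline {
public:
    using EosListener = std::function<void()>;

    // helloworld II, feature (1): register an end-of-stream listener.
    void add_eos_listener(EosListener l) { listeners_.push_back(std::move(l)); }

    // Pretend playback: consume all "bytes" first, then notify listeners,
    // i.e. listeners fire only after the last byte went to the device.
    void play(const std::string& data) {
        for (char c : data) { (void)c; /* would hand byte to audio device */ }
        for (auto& l : listeners_) l();
    }

private:
    std::vector<EosListener> listeners_;
};
```

Because the listener is a std::function, the same registration call accepts a plain C-style function, a lambda, or a bound Qt-like slot.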
From: Christian F. K. S. <ur...@li...> - 2004-08-25 16:05:05
|
On Wed, 2004-08-25 at 16:40, Matthias Kretz wrote:

> On Wednesday 25 August 2004 15:58, Thomas Vander Stichele wrote:
> > Hi Marco,
> >
> > > (1) Allow the playback of an encoded audio file (e.g. MP3). This will
> > > result in similar setups: a component for reading data from a
> > > file connected to a component for decoding connected to a component
> > > for audio output. (Together, this is called "pipeline" or "flow
> > > graph").
> > > (2) Set the filename of the file to be read.
> > > (3) Manually request/setup this functionality, i.e. no automatic setup
> > > of flow graphs.
> > > (4) Include some error handling.
> >
> > I'm curious about (3) - why should it not be done automatically ?
>
> Both should be possible. In case you don't want it to be done behind your
> back, when you want to have a special setup...
>
> > So you're saying you just want an application that can only play mp3's ?
> > Personally I think a much better test would be to have a helloworld that
> > can take a media file and just play it, whatever type it is. That's
> > what users care about, anyway.
>
> This is not about writing an especially useful app but about looking at
> how the API "feels". While I find it interesting to see how to set up the
> framework to play an mp3 file, it's of course also interesting how much
> the framework can do on its own.
> But... this is not really about comparing features, it's more about
> comparing APIs.

I would think this is the least interesting thing to look at in this setting, because if the Qt-style bindings to GStreamer (or, if someone made them, Qt-style bindings for MAS) aren't exactly what you want, you can change them to fit what you want in the coming year. In his talk at the conference Matthias Ettrich said that in some ways Qt was basically a binding to a lot of different C libraries; I remember he mentioned FreeType as one. 
So basically you should be able to get a Qt binding that is just as good and comfortable as the rest of Qt for you to program in and with. The important things for KDE to evaluate, IMHO, especially considering that KDE 4.0 is just a year away, are features, maturity, interoperability, viability and cross-platform support.

> > > In a second step, we would like to extend the helloworld program with
> > > the following feature (helloworld II):
> > >
> > > (1) Add a listener that gets notified if the currently playing file
> > > has ended, i.e. this listener is to be triggered after the last byte
> > > was played by the audio device.
> >
> > What sort of thing is your listener ? An in-program function callback ?
> > Another process ? Something else ?
>
> Any kind of listener. If in C you do it with a callback, just show the
> code for how to do it.
> In Qt/KDE you'd just want to connect a slot to a signal.
>
> > > In a final step, we would like to extend the helloworld program
> > > (helloworld I) to allow for distributed playback (helloworld III):
> > >
> > > (1) The component for reading data from a file should be located on the
> > > local host. The component for decoding, and playing the audio data should
> > > be located on a remote host.
> > >
> > > Notice that this third example should also demonstrate how easy (or
> > > painful) it is to develop networked multimedia applications using the
> > > particular framework. We hope that this will finally show that
> > > developing distributed multimedia applications means more than "well,
> > > simply write a component for streaming data and put that into your
> > > pipeline".
> >
> > Not sure why the third is important. While it's important for
> > multimedia frameworks to be able to do things like this, I don't see the
> > value of this for a desktop environment. Can you provide a use case
> > where this makes sense ?
> Thin clients, conferencing, cool stuff like shown in the NMM talk at
> aKademy ;-)

Yes, but I am not so sure that in most of these cases having the media framework on both sides is the right solution. For the thin-client scenario, for example, I think having a lightweight, limited-functionality media server client is the better solution, and then having the media framework on the other end do the needed transcodings and adaptations so that the light client can play/display what you are sending it.

My way of solving the exact demo done at the conference would be to have a daemon running which offered a rendezvous/zeroconf service which it broadcast over bluetooth/wlan. When you then arrive with your handheld which has bluetooth/wlan, it would inform the owner that X number of services are available and ask if she/he wants to transfer the stream to any of them. You don't really want to be playing from your PDA when you get home anyway, so adding some functionality which transfers the playback to the sound collection on your central server/machine would be the final solution to the issue.

> > Also, I don't think it's smart to do this for audio only. We all know
> > audio is the easiest to get right anyway, and audio presents a lot less
> > challenge to frameworks.
>
> Uh, I don't agree that audio is the easier part. Video is easier.
> Synchronization isn't easy, though.
>
> > I'm sure I could come up with other things that are important to be
> > tested, I'll think about it some more.
>
> Again, it's not about a testsuite or features but about the API.

And as I mentioned earlier, that is probably the least interesting thing to look at, as it is the easiest thing for you to fix at this point in time.

Christian |
From: Marco L. <ml...@cs...> - 2004-08-26 08:02:04
|
Christian Fredrik Kalager Schaller wrote:

[..]

> I would think this is the least interesting thing to look at in this
> setting, because if the Qt-style bindings to GStreamer (or, if someone
> made them, Qt-style bindings for MAS) aren't exactly what you want, you
> can change them to fit what you want in the coming year. In his talk at
> the conference Matthias Ettrich said that in some ways Qt was basically
> a binding to a lot of different C libraries; I remember he mentioned
> FreeType as one. So basically you should be able to get a Qt binding
> that is just as good and comfortable as the rest of Qt for you to
> program in and with. The important things for KDE to evaluate, IMHO,
> especially considering that KDE 4.0 is just a year away, are features,
> maturity, interoperability, viability and cross-platform support.

It is interesting that you mention bindings, because one motivation for developing NMM was to provide high-level C++ bindings to all these multimedia C libraries that run inside our plug-ins, e.g. for encoding and decoding.

Concerning maturity, etc.: yes, these are certainly very important points for projects like KDE. And it's up to the people from KDE to decide about this. However, to make such a decision, one has to develop some test programs. Getting familiar with development using MAS, GStreamer, or NMM (and making life easier) was the intention of starting this comparison.

[..]

> Yes, but I am not so sure that in most of these cases having the media
> framework on both sides is the right solution. For the thin-client
> scenario, for example, I think having a lightweight, limited-functionality
> media server client is the better solution, and then having the media
> framework on the other end do the needed transcodings and adaptations so
> that the light client can play/display what you are sending it.
> My way of solving the exact demo done at the conference would be to have
> a daemon running which offered a rendezvous/zeroconf service which it
> broadcast over bluetooth/wlan. When you then arrive with your handheld
> which has bluetooth/wlan, it would inform the owner that X number of
> services are available and ask if she/he wants to transfer the stream to
> any of them. You don't really want to be playing from your PDA when you
> get home anyway, so adding some functionality which transfers the
> playback to the sound collection on your central server/machine would be
> the final solution to the issue.

Yes, as I mentioned in my talk, you can use traditional client/server streaming approaches to develop such applications. However, it is not a question of whether you can realize such applications with the client/server approach; it is a question of how easy (or painful) it is to do so, or whether other programming models are better suited.

Also, you mainly speak about service discovery, which is to some extent the easiest step, because many working solutions already exist that can be used. The 'services' you speak of are mainly black boxes: you have to discover what kind of functionality they offer, what kind of media data they can receive and will provide, ...

What you do not mention - and what also needs to be provided to develop demos like we showed at the KDE conference - is:

* Type-safe control interfaces for distributed components (e.g. for controlling the DVD menus conveniently)
* Type-safe connection between your application and the 'service'
* Sharing of parts of running applications (e.g. flow graphs)
* Distributed synchronization between different applications
* Seamless handover of media playback between different devices

To make this more clear: first of all, for purely local DVD playback, you will have to write an application (the flow graph aka pipeline). The same goes for TV. This work has to be done using whatever multimedia framework you use. Then, using your 'services' to provide distributed functionality, you would need to write a service for remote access to TV and a service for remote access to DVD (e.g. because TV allows changing channels, while DVDs allow navigating menus). You are also fixed as to where work like transcoding happens (e.g. is the transcoding needed for the mobile client done on the host the service runs on?). And you still need to solve all the other issues pointed out above.

NMM uses a completely different approach: you only have to do the work needed for local DVD playback or TV once (i.e. provide the flow graph). Then you need to add 2-3 lines of code to distribute that application (distributed flow graph), or 2-3 more lines to share your application (shared distributed flow graph).

Have fun, Marco. |
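Marco's "2-3 lines to distribute the graph" claim can be illustrated with a deliberately toy C++ model. NodeDesc, its host field, and both helper functions are invented for this sketch; they are not NMM API, only the shape of the idea: the graph description is written once, and distribution is a per-node host assignment.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Invented graph description: each node records which host it should be
// instantiated on, defaulting to the local machine.
struct NodeDesc {
    std::string type;
    std::string host = "localhost";
};

// Local playback graph, written once, exactly as for helloworld I.
std::vector<NodeDesc> make_playback_graph() {
    return { {"file_source"}, {"mp3_decoder"}, {"audio_sink"} };
}

// helloworld III: the same graph, "distributed" by the 2-3 extra lines
// Marco describes - assign the decoding and output nodes to a remote host.
std::vector<NodeDesc> make_distributed_graph(const std::string& remote) {
    auto g = make_playback_graph();
    g[1].host = remote;   // decode remotely
    g[2].host = remote;   // play remotely
    return g;
}
```

In a real distributed framework, instantiating such a description would of course also involve proxies, serialization of the media stream between hosts, and synchronization — which is exactly the work the framework is supposed to hide.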
From: Thomas V. S. <th...@ap...> - 2004-08-25 16:13:51
|
Hi,

> > I'm curious about (3) - why should it not be done automatically ?
>
> Both should be possible. In case you don't want it to be done behind your
> back, when you want to have a special setup...

Sure. But automatic is more important than manual, since you have to write less code then :) I agree that both should be possible, so I'm fine with having both. But let's not pretend we can measure an API for media playback without having it work automatically.

> This is not about writing an especially useful app but about looking at
> how the API "feels". While I find it interesting to see how to set up the
> framework to play an mp3 file, it's of course also interesting how much
> the framework can do on its own.
> But... this is not really about comparing features, it's more about
> comparing APIs.

I guess that depends on point of view. API is volatile in the sense that API can be added, or libraries added on top, for easier manipulation. In fact, I'd say that a more useful exercise to evaluate frameworks would be for *someone else* to write an application against an as-yet non-existing API, and then the frameworks should implement the API. But that's just my opinion.

> > > In a final step, we would like to extend the helloworld program
> > > (helloworld I) to allow for distributed playback (helloworld III):
> > >
> > > (1) The component for reading data from a file should be located on the
> > > local host. The component for decoding, and playing the audio data should
> > > be located on a remote host.
> > >
> > > Notice that this third example should also demonstrate how easy (or
> > > painful) it is to develop networked multimedia applications using the
> > > particular framework. We hope that this will finally show that
> > > developing distributed multimedia applications means more than "well,
> > > simply write a component for streaming data and put that into your
> > > pipeline".
> >
> > Not sure why the third is important. 
> > While it's important for
> > multimedia frameworks to be able to do things like this, I don't see the
> > value of this for a desktop environment. Can you provide a use case
> > where this makes sense ?
>
> Thin clients, conferencing, cool stuff like shown in the NMM talk at
> aKademy ;-)

We all love cool stuff :) But I'm not sure I follow the description of helloworld III as it is currently stated. At the very least there'd need to be two programs, no ?

> > Also, I don't think it's smart to do this for audio only. We all know
> > audio is the easiest to get right anyway, and audio presents a lot less
> > challenge to frameworks.
>
> Uh, I don't agree that audio is the easier part. Video is easier.
> Synchronization isn't easy, though.

Maybe I didn't make myself clear. Since video most of the time also contains audio, video is more difficult than audio: it contains audio plus something else. For example, take into account the fact that you all of a sudden need demuxers, and that there is a wealth of additional formats that need to be supported, all with particular quirks, ... Unless you feel that a framework that can play mp3s wonderfully well, but blows up when you hand it an .avi file, is an acceptable framework for KDE :)

> > I'm sure I could come up with other things that are important to be
> > tested, I'll think about it some more.
>
> Again, it's not about a testsuite or features but about the API.

Same here. I'm also talking about the API. But you need to make sure that you can evaluate the framework through the API you give; for example, KDE people have said they want to be able to record as well. No point in evaluating APIs on playback only then, IMO.

Thomas

Dave/Dina : future TV today ! - http://www.davedina.org/
<-*- thomas (dot) apestaart (dot) org -*->
Even bums don't not got a car
<-*- thomas (at) apestaart (dot) org -*->
URGent, best radio on the net - 24/7 ! - http://urgent.fm/ |
From: Marco L. <ml...@cs...> - 2004-08-25 16:01:39
|
Thomas Vander Stichele wrote:

> Hi Marco,
>
> > (1) Allow the playback of an encoded audio file (e.g. MP3). This will
> > result in similar setups: a component for reading data from a
> > file connected to a component for decoding connected to a component
> > for audio output. (Together, this is called "pipeline" or "flow
> > graph").
> > (2) Set the filename of the file to be read.
> > (3) Manually request/setup this functionality, i.e. no automatic setup
> > of flow graphs.
> > (4) Include some error handling.
>
> I'm curious about (3) - why should it not be done automatically ? So
> you're saying you just want an application that can only play mp3's ?
> Personally I think a much better test would be to have a helloworld that
> can take a media file and just play it, whatever type it is. That's
> what users care about, anyway.

Sure, there are a lot of users that simply want to play back files. But as stated in my first email, this comparison is intended to compare the programming model and the APIs to be used by programmers. Automatic setup of flow graphs is definitely a nice feature. However, a programmer will have to use the API to develop such a feature. Providing some "feeling" for that was the idea of this first example. Furthermore, such a feature is definitely not something that belongs to the core API of any multimedia framework; it's an extension built on top of the core.

> > In a second step, we would like to extend the helloworld program with
> > the following feature (helloworld II):
> >
> > (1) Add a listener that gets notified if the currently playing file
> > has ended, i.e. this listener is to be triggered after the last byte
> > was played by the audio device.
>
> What sort of thing is your listener ? An in-program function callback ?
> Another process ? Something else ?

Just take a look at our source code: http://graphics.cs.uni-sb.de/NMM/current/Docs/helloworld/x122.html So, basically, it's an in-program function callback.

> > In a final step, we would like to extend the helloworld program
> > (helloworld I) to allow for distributed playback (helloworld III):
> >
> > (1) The component for reading data from a file should be located on the
> > local host. The component for decoding, and playing the audio data should
> > be located on a remote host.
> >
> > Notice that this third example should also demonstrate how easy (or
> > painful) it is to develop networked multimedia applications using the
> > particular framework. We hope that this will finally show that
> > developing distributed multimedia applications means more than "well,
> > simply write a component for streaming data and put that into your
> > pipeline".
>
> Not sure why the third is important. While it's important for
> multimedia frameworks to be able to do things like this, I don't see the
> value of this for a desktop environment. Can you provide a use case
> where this makes sense ?

There are a lot of examples; just take a look at the slides of our talk.

> Also, I don't think it's smart to do this for audio only. We all know
> audio is the easiest to get right anyway, and audio presents a lot less
> challenge to frameworks.

Yes, there might be other examples. However, the idea was to compare the basic programming model and API of the frameworks.

> I'm sure I could come up with other things that are important to be
> tested, I'll think about it some more.

Yes, sure. However, before everyone comes up with some more things to compare, I think it would be great if we could first finish this comparison. Again: together, the idea was to compare the basic programming model and API of the frameworks. We think that the provided examples are very suitable for that.

Have fun, Marco. |
From: Thomas V. S. <th...@ap...> - 2004-08-25 16:29:41
|
Hi Marco,

> Sure, there are a lot of users that simply want to play back files. But
> as stated in my first email, this comparison is intended to compare the
> programming model and the APIs to be used by programmers.

... but it's worth seeing what you have to do, API-wise, with a framework, to get it to play back more than one format. It's quite easy to play only one format statically; with a requirement like that even libmad can participate in this test :) As soon as you want to play *two* formats, though, it's a lot more interesting to see how you need to use the API to be able to do that. Especially since the actual code you'll have to write for your application will need to use this code, no ?

> Automatic setup of flow graphs is definitely a nice feature. However, a
> programmer will have to use the API to develop such a feature. Providing
> some "feeling" for that was the idea of this first example.

It depends on the framework. I'd say that a framework that does this for you is easier to use than one that doesn't.

> Furthermore, such a feature is definitely not something that belongs to
> the core API of any multimedia framework; it's an extension built on top
> of the core.

Matter of opinion. FWIW, in GStreamer, that's exactly how it's done. No autoplugging in the core, but there is autoplugging available, accessible through the same API as for "static" pipelines, more or less.

> > Also, I don't think it's smart to do this for audio only. We all know
> > audio is the easiest to get right anyway, and audio presents a lot less
> > challenge to frameworks.
>
> Yes, there might be other examples. However, the idea was to compare the
> basic programming model and API of the frameworks.

Agreed. The point is, you can't do a valid comparison between *multimedia* frameworks if you don't throw *multi*media at the tests. What good is a framework that is really nice for audio-only stuff, but falls apart for video ? What good is an API that's really nice for audio stuff, but can't handle video ?

> Yes, sure. However, before everyone comes up with some more things to
> compare, I think it would be great if we could first finish this
> comparison.

Heh :) To get an objective comparison, you first need to have the "contestants" agree on the premises of the comparison. If you want to do this right, you need to get the foundation right as well. Otherwise we might as well do a comparison based on a set of helloworld examples that focused on autoplugging and didn't focus on networking :) I'm not at all saying we should, because that would be biased as well - I'm just saying we need to agree on the rules before playing by them, if we want to give a good comparison to the people that matter here, which are the KDE multimedia guys.

> Again: together, the idea was to compare the basic programming model and
> API of the frameworks. We think that the provided examples are very
> suitable for that.

I respectfully disagree; they are not broad enough to give a good idea of what code would need to be written for a playback application. Let's iterate on the playing field some more before starting :)

Thomas

Dave/Dina : future TV today ! - http://www.davedina.org/
<-*- thomas (dot) apestaart (dot) org -*->
They say if you love somebody you have got to set them free
but I would rather be locked to you than living in this pain and misery
<-*- thomas (at) apestaart (dot) org -*->
URGent, best radio on the net - 24/7 ! - http://urgent.fm/ |
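Thomas's point — autoplugging built on top of the same API used for static pipelines — can be sketched with a toy C++ helper. Everything here is invented for illustration (a file-extension lookup stands in for real typefinding, which inspects the data itself, and the element names match no real framework):

```cpp
#include <cassert>
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Hypothetical registry mapping a detected media type to a decoder name.
static const std::map<std::string, std::string> kDecoders = {
    {"mp3", "mp3_decoder"}, {"ogg", "vorbis_decoder"}, {"wav", "wav_parser"},
};

// "Autoplug" a playback chain: the result uses the same element-list
// representation a manual setup would, but the decoder is chosen for the
// caller instead of being hard-coded.
std::vector<std::string> autoplug(const std::string& filename) {
    auto dot = filename.rfind('.');
    if (dot == std::string::npos)
        throw std::runtime_error("cannot determine media type");
    auto it = kDecoders.find(filename.substr(dot + 1));
    if (it == kDecoders.end())
        throw std::runtime_error("no decoder for this type");
    return {"file_source", it->second, "audio_sink"};
}
```

The design point is that nothing new is needed at the application level to consume the result: an autoplugged chain and a hand-built one look identical downstream.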
From: <ram...@ya...> - 2004-08-25 18:11:44
It is not so important how the API "feels" to the programmer. That can
impact your first days of exposure, not more. It is much more important
that everything you need to do is possible. That cannot be found by a
simple evaluation; only the experience of months of programming with a
framework can tell you that.

Perhaps an (off-topic) example can be helpful. According to the criterion
above, Modula-2 is a great programming language. Its syntax is certainly
clean, arrays are intrinsic types, ... but you cannot allocate an array
of variable size. Other restrictions are even less obvious. For instance,
in this language one cannot make unsafe conversions, and thus one cannot
pass an additional "void pointer" argument to callbacks. So when routine
A calls B, passing a callback to C, and one wants A to be able to pass an
argument through B to C, B must know the exact type of this argument. If
one wants to change the type of information that A delivers to C through
B, B has to be modified or at least recompiled.

This illustrates that the most important difference between systems is
not so much in how they feel, but in how many corner cases they have and
how well those are handled.

I hope that this is useful.
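[Editorial note: the callback limitation described above is easiest to
see next to a language that does have an opaque "user data" value (void*
in C, any object in Python). A minimal sketch, with invented names:]

```python
# Routine B invokes a callback C on behalf of A. The extra argument is
# opaque to B: B forwards it without ever knowing its type, so A can
# change that type freely without touching B. In Modula-2, as described
# above, B would need to know the exact type and be changed with it.

def b(items, callback, user_data):
    # B's only job: iterate and invoke; user_data passes through untouched
    for item in items:
        callback(item, user_data)

def a():
    collected = []                 # A's private state, of a type B never sees

    def c(item, state):            # C (and only C) knows what user_data is
        state.append(item * 2)

    b([1, 2, 3], c, collected)
    return collected
```

Swapping `collected` for a dict, an object, or anything else requires no
change to `b()` - that is exactly the flexibility the strict typing rules
of Modula-2 take away.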
From: Marco L. <ml...@cs...> - 2004-08-26 08:02:49
Thomas Vander Stichele wrote:
[..]
> > > Also, I don't think it's smart to do this for audio only. We all
> > > know audio is the easiest to get right anyway, and audio presents a
> > > lot less challenge to frameworks.
> >
> > yes, there might be other examples. However, the idea was to compare
> > the basic programming model and API of the frameworks.
>
> Agreed. The point is, you can't do a valid comparison between
> *multimedia* frameworks if you don't throw *multi*media at the tests.
> What good is a framework that is really nice for audio-only stuff, but
> falls apart for video? What good is an API that's really nice for audio
> stuff, but can't handle video?

Sure, we can add multi-media, e.g. audio, video, ... (how about SMIL? No,
just kidding ;) However, as stated in my first email: the audio player is
meant as a starting point, and these features are simple enough to be
provided by all three frameworks.

[..]
> > Again: together, the idea was to compare the basic programming model
> > and API of the frameworks. We think that the provided examples are
> > very suitable for that.
>
> I respectfully disagree, they are not broad enough to give a good idea
> of what code would need to be written for a playback application. Let's
> iterate the playing field some more before starting :)

... but they are simple enough as a starting point. And if a framework is
generic, then setting one capability (e.g. setting the filename of the
source plug-in) is handled like setting any other capability. Likewise,
registering a listener (or callback) for one event is handled
generically. So these examples already cover a lot.

Have fun,
Marco.
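[Editorial note: Marco's "generic capability" argument - one mechanism
covers every property and every event - can be sketched as follows. This
is an invented toy API, not NMM's; it only illustrates why one example
per mechanism already exercises the general case.]

```python
# A node with one generic property setter and one generic listener
# registration. Setting "filename" on a source uses the same call as
# setting any other capability; subscribing to end-of-stream uses the
# same call as subscribing to any other event.

class Node:
    def __init__(self):
        self._props = {}
        self._listeners = {}

    def set_value(self, name, value):
        # one entry point for every capability
        self._props[name] = value

    def get_value(self, name):
        return self._props[name]

    def register_listener(self, event, fn):
        # one entry point for every event type
        self._listeners.setdefault(event, []).append(fn)

    def emit(self, event, *args):
        for fn in self._listeners.get(event, []):
            fn(*args)
```

A helloworld that sets one property and registers one listener therefore
demonstrates the whole mechanism, which is the point being made above.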
From: Thomas V. S. <th...@ap...> - 2004-08-26 08:55:52
Hi,

> sure we can add multi-media, e.g. audio, video, ..., (how about SMIL?
> No, just kidding ;)
> However, as stated in my first email: The audio player is meant as
> starting point,

OK, so we have a difference of opinion. My point is that a "starting
point" doesn't tell you enough about what you want to know to get a good
evaluation. Otherwise we might as well compare the different calls needed
to initialize the framework without doing anything.

> and: these features are simple enough to be provided for all three
> frameworks.

So is video. Or, rephrased: if a multimedia framework doesn't play video,
then it shouldn't even be a "contestant" in the comparison. Video is not
something you can "add on later".

> And: if a framework is generic, then setting one capability (e.g.
> setting the filename of the source plug-in) is handled as setting any
> other capability.

In a theoretically perfect model and world, yes. In practice, any
abstraction is leaky in the real world. The point to consider is how
these leaks are handled. Lots of examples exist: xvideo output only
allowing one xv port, or only a maximum size; webcams only allowing a
fixed set of framerates; v4l devices only allowing certain width-by-
height combinations; network connections failing; ...

Anyway, like I said before, if you want a level playing field to compare
the three on, you need to get the three frameworks to agree on that
playing field first. I personally don't see the merit in writing a
helloworld that just plays an mp3, because I can already write that with
only libmad. The examples should highlight strong and weak points in
current API and design. There's no point in setting up examples tailored
to show the strengths of one of the three frameworks and then seeing how
the others match up. For example, for GStreamer, helloworld 3 would show
a weak point in network transparency, and helloworld 1 would show a
strong point in ease of setting up a decoding pipeline. That's probably
because, in our opinion, ease of decoding is slightly more important to
end users than network transparency. All that would prove is that the
examples were tailored to show something specific, not something the KDE
people want to know.

Anyway, it seems to me the KDE people are already figuring out themselves
how to do an evaluation: they're writing an API they'd like, and then the
backends should implement it. Which sounds to me like the correct test,
originating from the right people.

Thomas

Dave/Dina : future TV today ! - http://www.davedina.org/
<-*- thomas (dot) apestaart (dot) org -*->
If only you'd come back to me
If you laid at my side
wouldn't need no mojo pin
to keep me satisfied
<-*- thomas (at) apestaart (dot) org -*->
URGent, best radio on the net - 24/7 ! - http://urgent.fm/
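[Editorial note: one way a framework can handle the "leaks" listed above
(fixed webcam framerates, maximum sizes, ...) is to have the device
advertise its constraints and negotiate, instead of pretending any value
works. A minimal sketch with invented names, not the API of any of the
three frameworks:]

```python
# A device with a hardware-fixed set of framerates, and a negotiation
# helper that picks the closest supported rate rather than failing or
# silently accepting an impossible request.

class Webcam:
    SUPPORTED_FPS = (15, 30)        # fixed set, as with many real webcams

    def set_framerate(self, fps):
        # the leak is explicit: unsupported rates are rejected here
        if fps not in self.SUPPORTED_FPS:
            raise ValueError("unsupported framerate: %d" % fps)
        self.fps = fps

def negotiate_framerate(device, requested):
    # negotiation layer: choose the nearest rate the device supports
    best = min(device.SUPPORTED_FPS, key=lambda s: abs(s - requested))
    device.set_framerate(best)
    return best
```

The design question raised in the thread is exactly this: does the API
force every application to reimplement `negotiate_framerate`, or does
the framework handle such constraints generically?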
From: Matthias W. <ma...@st...> - 2004-08-26 10:28:43
Hi,

actually, I'd like to focus on a different view than just ease of use of
an API. I don't really want to see KDE exchanging the Arts hell for a
MAS, NMM or GStreamer Tartaros. Thus, other points like stability of the
ABI, size of the development community, and focus of development are more
important anyway.

Actually, KDE as a desktop environment has very few requirements
regarding its multimedia framework. I'd like to explicitly exclude
multimedia applications for now; those should be able to use whatever
framework their developers think fits. KDE should stop imposing its own
framework on them, as this only causes bloat.

KDE currently has some really nice features, apart from its multimedia
applications, which need to be kept:

- audible notifications (KNotify)
- pre-listening of audio files; this must also work across the network
  through kioslaves, without having to download the file first
- static or even animated previews of video files, possibly across a
  network, too
- you get the tune.

This is sort of a minimum requirement. We should seek to make this
possible with whatever framework is available. Meaning: in KDE, we need
an adaptor that implements those few generic use cases using either MAS,
NMM or GStreamer, or <plug your favorite candidate here>. This shouldn't
be too difficult. I volunteer to work on the KDE::PlayObject stuff to
make it more generic and no longer expose details about the underlying
framework, if someone else steps up to write drivers for MAS, NMM and
GStreamer. That way, we'd loosen KDE's dependency on a particular
multimedia backend and avoid the need to do one ourselves (like we tried
with arts).
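[Editorial note: the adaptor proposed above - a small generic interface
with one driver per framework behind it - follows a standard pattern. A
rough sketch; the class and method names are invented and are not the
real KDE::PlayObject API:]

```python
# Generic interface KDE code would program against, with pluggable
# backend drivers selected once at startup. Applications never see
# which framework is underneath.

class PlayObject:
    """The generic playback interface."""
    def play(self, url):
        raise NotImplementedError

class GStreamerDriver(PlayObject):
    def play(self, url):
        # a real driver would build a GStreamer pipeline here
        return "gstreamer: playing %s" % url

class NMMDriver(PlayObject):
    def play(self, url):
        # a real driver would set up an NMM flow graph here
        return "nmm: playing %s" % url

_DRIVERS = {"gstreamer": GStreamerDriver, "nmm": NMMDriver}

def create_play_object(backend):
    # the only place that knows which backends exist
    return _DRIVERS[backend]()
```

Adding MAS (or any other candidate) means adding one driver class and one
registry entry; nothing that calls `play()` has to change.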
From: Marco L. <ml...@cs...> - 2004-08-26 12:02:40
Matthias Welwarsky wrote:
[..]
> This is sort of a minimum requirement. We should seek to make this
> possible with whatever framework is available. Meaning: In KDE, we need
> an adaptor that implements those few, generic use cases, using either
> MAS, NMM or GStreamer, or <plug you favorite candidate here>. This
> shouldn't be too difficult. I volunteer to work on the KDE::PlayObject
> stuff to make it more generic and not expose details about the
> underlying framework any more, if someone else steps up to write
> drivers for MAS, NMM and GStreamer.

I personally agree with this approach; being independent of a particular
multimedia framework is, for example, a great feature of amaroK. If this
is the way KDE decides to go, I think we will try to help as much as
possible.

Have fun,
Marco.
From: Michael N. <mic...@gm...> - 2004-08-26 18:05:25
On Wednesday 25 August 2004 14:49, Marco Lohse wrote:
> Hi there,
>
> from the feedback we got at the KDE conference I think one important
> thing for KDE multimedia developers (as well as for other multimedia
> developers) would be to have some comparison between MAS, GStreamer,
> and NMM.
>
> We would like to start this comparison with a look at two things:
>
> (1) the programming models (i.e. what code you have to write down to
> achieve certain things) and
> (2) the APIs provided by the different frameworks (in correspondance
> to Matthias Ettrich's statement "the API is to the programmer what a
> GUI is to the end user").

I can't help noticing that, as usual, portability seems to be of no
concern. Of the three software packages compared here, only GStreamer
seems to make an actual effort to run on platforms != Linux (it's the
only one ported to FreeBSD as of now, too). MAS at least states they want
to be portable (although non-Linux development seems to be essentially
not happening), and NMM is clearly focused on Linux.

-- 
  ,_,   | Michael Nottebrock               | lo...@fr...
(/^ ^\) | FreeBSD - The Power to Serve     | http://www.freebsd.org
  \u/   | K Desktop Environment on FreeBSD | http://freebsd.kde.org
From: Marco L. <ml...@cs...> - 2004-08-27 07:30:22
Michael Nottebrock wrote:
[..]
> I can't help noticing that, as usual, portability seems to be of no
> concern. From the three software packages compared here, only gstreamer
> seems to make an actual effort to run on platforms != Linux (it's the
> only one ported to FreeBSD as of now, too). MAS at least states they
> want to be portable (non-Linux development seems to be essentially not
> happening though) and NMM is clearly focused on Linux.

Well, for NMM, that is only partly true. From the point of view of
software design, we did care about portability: all platform-dependent
code is completely abstracted and therefore hidden within classes. So
there is no inherent portability problem. The only reason NMM is
currently supported on Linux only is quite simple: the NMM team only has
access to Linux systems (PCs or ARM PDAs, e.g. the Compaq iPAQ). If
someone is willing to help with support for other platforms, you are
very welcome.

Have fun,
Marco.
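[Editorial note: the design Marco describes - platform-dependent code
isolated behind a common interface, so porting means adding one class
per platform - can be sketched as follows. All names here are
illustrative, not actual NMM classes.]

```python
import sys

# Common interface the rest of the framework programs against; only the
# factory below knows which platforms exist.

class AudioDevice:
    def open(self):
        raise NotImplementedError

class AlsaDevice(AudioDevice):        # Linux implementation
    def open(self):
        return "alsa device opened"

class OssDevice(AudioDevice):         # FreeBSD/other Unix implementation
    def open(self):
        return "oss device opened"

def make_audio_device(platform=None):
    # porting to a new OS means adding a subclass and a branch here;
    # callers of make_audio_device() are untouched
    platform = platform or sys.platform
    if platform.startswith("linux"):
        return AlsaDevice()
    return OssDevice()
```

Under this structure the "only Linux is supported" limitation is a matter
of which subclasses have been written, not of the framework's design.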