Thread: Re: [oll-user] Call for initial feedback (Page 2)
Resources for LilyPond and LaTeX users writing (about) music
Status: Alpha
From: Urs L. <ul...@op...> - 2013-03-28 18:02:24
Am 28.03.2013 17:07, schrieb Joseph Rushton Wakeling:
> ...
> Looking into that code, I realized there is another issue. Your stated
> licence is GPL. How is that meant to interact with the case where e.g. I
> write an original piece of music and use OLLib's toolbox in my LilyPond
> source file? I am reasonably sure that if I release only the resulting
> PDF, there would not be an issue, but if I want to distribute the .ly
> source file(s) then almost certainly GPL requirements would kick in and
> would therefore force me to give a GPL licence to my piece of music.
>
> This seems to me to be unacceptable overreach on the part of the
> licensing. It may be worth raising this on the LP lists, as it probably
> also applies to LP \include's.

Just one thought (I don't have more time now): could it be that LilyPond
source files which include OLLib, or any other files from the LilyPond
distribution, can be regarded as 'documents', so that the GPL doesn't
apply in the way you suggested?

I think there is something to that; otherwise we'd have a similar
situation with LaTeX packages, wouldn't we? The GPL doesn't say that the
_usage result_ of GPLed software must be GPLed. And although .ly files
are source code, I'd consider them documents rather than software.

???
From: Joseph R. W. <jos...@we...> - 2013-03-28 18:29:25
On 03/28/2013 07:01 PM, Urs Liska wrote:
> I think there is something to that, otherwise we'd have a similar
> situation with LaTeX packages, isn't it?

Aren't most LaTeX packages licensed differently from the GPL?

> The GPL doesn't say that the _usage result_ of GPLed software must be
> GPLed. And although .ly files are source code, I'd consider them rather
> documents than software.

I think there's a level of truth to that, and also to David's observation
that there is not actually a combined entity being formed. But I still
find the situation far too ambiguous for my taste, and tweaking the
licence (or adding a clear exception) might be the easiest way to resolve
that.
From: Urs L. <ul...@op...> - 2013-03-28 13:24:10
Am 28.03.2013 13:19, schrieb Joseph Rushton Wakeling:
> On 03/20/2013 04:50 PM, Janek Warchoł wrote:
>>> If that would be possible, we could have one SourceForge project only,
>>> which would be less confusing probably.
>> I like tinkering with git and i think i could learn how to merge
>> repositories. However, maybe git submodules would be a correct answer?
>> A submodule is just a repository inside another repository, so that we
>> could have a big repo containing all projects, while all of them would
>> remain to be separate repos. I don't know how well that plays with
>> SourceForge, however.
> Sorry to come to this late in the day, but I don't see the point in
> having one-repo-to-rule-them-all _unless_ done via submodules -- the
> separation into 3 different projects is quite correct IMO, as each part
> is clearly independently useful (or at least, such dependencies as exist
> are one-way only).

Hm, I'm not really sure. If I'm not completely mistaken (and your linked
example doesn't indicate otherwise), a submodule is a kind of link to
another git repository. So you have:

- a 'toplevel' repository
- one or more accompanying repositories
- one or more submodules in the main repository that have the
  accompanying repos as their remote(s)

I'm somewhat reluctant about this overhead of keeping everything in sync,
though I have to admit I don't have any practical experience with it. And
I haven't yet found a way (for myself) to cope with the growing number of
repos that have to be kept in sync every day.

The dependencies between the parts exist only for contributors, but:
everybody who wants to contribute is responsible for contributing the
relevant documentation for that contribution. And to compile the manuals
one needs everything: the basic LaTeX stuff, musicexamples, and
lilyglyphs. So if I want to contribute, I'd have to clone the repos and
take care not to break anything with the remotes of the submodules.
Maybe I'm wrong about that, but it somehow looks scary to me. Users who
only want to _use_ one or more of the parts will get archived 'releases'
anyway.

I'm commenting on the structure and relation of the parts in your other
email.

Best
Urs

> If you want a nice example of submodules, you can see it here with LDC
> (a compiler for the D programming language based on the LLVM backend).
> This project has the language runtime, standard library (Phobos) and
> test suite each as separate submodules.
> https://github.com/ldc-developers/ldc
>
> Since OLL is moving back to GitHub, that shouldn't be a problem.
>
> ------------------------------------------------------------------------------
> Own the Future-Intel® Level Up Game Demo Contest 2013
> Rise to greatness in Intel's independent game demo contest.
> Compete for recognition, cash, and the chance to get your game
> on Steam. $5K grand prize plus 10 genre and skill prizes.
> Submit your demo by 6/6/13. http://p.sf.net/sfu/intel_levelupd2d
> _______________________________________________
> openlilylib-user mailing list
> ope...@li...
> https://lists.sourceforge.net/lists/listinfo/openlilylib-user
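[Editor's note: the 'toplevel repo / accompanying repos / submodule links' layout Urs describes can be sketched with throwaway local repositories. All paths and the repo names (`openlilylib`, `lilyglyphs`) are hypothetical stand-ins; the `protocol.file.allow=always` override is needed on newer git versions, which block file-path submodules by default.]

```shell
# Minimal local sketch of the layout discussed above (hypothetical names).
set -e
cd "$(mktemp -d)"

# An 'accompanying' repository, standing in for e.g. lilyglyphs:
git init -q lilyglyphs
git -C lilyglyphs -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m "initial"

# The 'toplevel' repository, linking to it as a submodule:
git init -q openlilylib
cd openlilylib
git -c protocol.file.allow=always submodule add "$OLDPWD/lilyglyphs" lilyglyphs
git -c user.name=demo -c user.email=demo@example.org \
    commit -q -m "add lilyglyphs as a submodule"

# The superproject stores only a pointer (a commit hash) plus a URL mapping,
# so the accompanying repo stays a fully separate project:
git ls-tree HEAD lilyglyphs   # entry of type 'commit', not 'tree'
cat .gitmodules               # maps the path to the accompanying repo's URL
```

So the 'mirror' overhead Urs worries about is really just this recorded pointer; the files themselves live only in the accompanying repository.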
From: Joseph R. W. <jos...@we...> - 2013-03-28 14:27:48
On 03/28/2013 02:23 PM, Urs Liska wrote:
> I'm somewhat reluctant to this overhead to keep everything in sync.
> But I have to admit I don't have any practical experience with that.
> And I haven't found a way so far (for myself) to cope with the growing
> number of repos that have to be kept in sync every day.

git submodule update [--init] doesn't work? What are the particular
synchronization problems that you see arising?

> The dependencies between the parts is there only for contributors, but:
> Everybody who wants to contribute is responsible to contribute the
> relevant documentation for the contribution. And to compile the manuals
> one needs everything: The basic LaTeX stuff, musicexamples, and
> lilyglyphs. So if I want to contribute, I'd have to clone the repos,
> have to take care of not breaking anything with the remotes of the
> submodules. Maybe I'm wrong with that, but it somehow looks scary to me.
> Users who only want to _use_ one or more of the parts will get archived
> 'releases' anyway.

Let's put it this way: "breaking" dependencies should go only one way,
i.e. if you change the toolbox or lilyglyphs then perhaps the manuals
won't work, but changing the manuals shouldn't _break_ those upstreams.
(They might contain incorrect or out-of-date information, but that's a
slightly different problem.)

Now, in turn, it ought to be possible to organize and plan things so that
breaking changes are rare. This is effectively a policy issue.

To put this in context, KDE is a huge project with uncountable
inter-dependencies among projects, and it was originally tracked in one
super-huge SVN repo. But on switching to git, they moved to the
one-submodule-per-package model.

I think you need to think of it like this: users who only want to _use_
one or more of the parts may operate off archived releases, but what
about users who only want to _work_ on one of the parts? It's really not
clear to me why someone who wants to work solely on lilyglyphs (say)
should have to pull in OLLib or the manuals. Likewise it's not clear to
me why someone who contributes something to the toolbox should have to
contribute documentation beyond the most basic description (which is why
I suggested drawing a line between the basic toolbox documentation and
more in-depth tutorials and manuals).

It just seems to me that, looking at things, your sense of
interdependency between the parts is a social rather than a technical
requirement, and that tweaking those social requirements could still get
you where you want to go, just making it much easier technically.

LP contribution is in itself unnecessarily complicated because the
requirements for doc and code contributions overlap in a way that they
shouldn't have to, and I don't think you should make the same mistakes.

For example, one of the benefits of a DVCS is that I can work on my
little submodule by myself, do what I want to, and only worry about
breakages when I submit for merge -- when it gets run through the test
suite. _That's_ how you keep things in sync: by automated testing prior
to accepting a pull request, rather than by forcing everyone to have the
whole massive archive on their machine and keep it in sync manually
(which, apart from a few hyper-virtuous individuals, won't happen in any
case).

I think GitHub has some nice tools for integrating pull requests with
automated testing, and this could be worth looking into.
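[Editor's note: the `git submodule update [--init]` workflow Joseph refers to, sketched end-to-end with throwaway local repositories. The names `oll` and `lib` are hypothetical; `protocol.file.allow=always` is only needed on newer git versions that restrict file-path submodules.]

```shell
set -e
cd "$(mktemp -d)"; root=$PWD

# Hypothetical upstream: a superproject 'oll' with one submodule 'lib'.
git init -q lib
git -C lib -c user.name=d -c user.email=d@example.org \
    commit -q --allow-empty -m "lib: initial"
git init -q oll
(cd oll &&
 git -c protocol.file.allow=always submodule add "$root/lib" lib &&
 git -c user.name=d -c user.email=d@example.org commit -q -m "add lib")

# A fresh contributor needs exactly one extra command beyond the clone:
git clone -q "$root/oll" work
cd work
git -c protocol.file.allow=always submodule update --init  # fetch recorded commit
git submodule status   # a leading space means 'in sync with the superproject'
```

A usage note: `--init` is only needed the first time; afterwards a plain `git submodule update` moves every submodule to the commit the superproject records.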
From: Urs L. <ul...@op...> - 2013-03-28 17:01:19
Am 28.03.2013 15:27, schrieb Joseph Rushton Wakeling:
> On 03/28/2013 02:23 PM, Urs Liska wrote:
>> I'm somewhat reluctant to this overhead to keep everything in sync.
>> But I have to admit I don't have any practical experience with that.
>> And I haven't found a way so far (for myself) to cope with the growing
>> number of repos that have to be kept in sync every day.
> git submodule update [--init] doesn't work?

I didn't find this in http://git-scm.com/book/en/Git-Tools-Submodules.
The man page for git-submodule documents that command, but I have to
admit I don't fully understand it. That man page also says: "Submodules
allow foreign repositories to be embedded within a dedicated subdirectory
of the source tree". That's what I don't really understand in our
context: both documents suggest using submodules to integrate existing
remote secondary repositories into the main one. Why this overhead of
having a remote repository and its submodule 'mirror'? And the caveats I
read about in the Pro Git book don't look reassuring either.

> What are the particular synchronization problems that you see arising?

Not really problems; it's rather the complexity of the sheer number of
repos. An example: if I work on, say, the manual for 'musicexamples', I
may end up updating the stylesheet package. If everything is in one
repository I will surely notice the need to commit this change, at least
when I want to power down the computer and see that git status isn't
clean. If the files were in separate repositories, I would have to take
care myself not to forget to also update the repo containing the
stylesheet.

Or: each time I change computers I have to go through all the repos to
check whether there are changes to be fetched and merged (or rebased).
That's quite tedious and requires a fair amount of concentration in order
not to omit a branch, for example.

>> The dependencies between the parts is there only for contributors, but:
>> Everybody who wants to contribute is responsible to contribute the
>> relevant documentation for the contribution. And to compile the manuals
>> one needs everything: The basic LaTeX stuff, musicexamples, and
>> lilyglyphs. So if I want to contribute, I'd have to clone the repos,
>> have to take care of not breaking anything with the remotes of the
>> submodules. Maybe I'm wrong with that, but it somehow looks scary to me.
>> Users who only want to _use_ one or more of the parts will get archived
>> 'releases' anyway.
> Let's put it this way: "breaking" dependencies should be only one way,
> i.e. if you change the toolbox or lilyglyphs then perhaps the manuals
> won't work, but changing the manuals shouldn't _break_ those upstreams.
> (They might contain incorrect or out-of-date information, but that's a
> slightly different problem.)

That's right (and it is also the case currently).

> Now, in turn, it ought to be possible to organize and plan things so
> that breaking changes are rare. This is effectively a policy issue.
>
> To put this in context, KDE is a huge project with uncountable
> inter-dependencies among projects, and which originally was tracked in
> one super-huge SVN repo. But on switching to git, they moved to the
> one-submodule-per-package model.
>
> I think you need to think of it like this: users who only want to _use_
> one or more of the parts may operate off archived releases, but what
> about users who only want to _work_ on one of the parts? It's really not
> clear to me why someone who wants to work solely on lilyglyphs (say)
> should have to pull in OLLib or the manuals.

Maybe you're right. But as it is now, I can tell a contributor: "Clone
the repository and make sure that LaTeX finds this folder and LilyPond
finds that folder. You may get stuff you don't need, but you get
everything you need with one clone." If there were smaller chunks, I'd
have to say: "You may clone the module you want to work on, but you will
have to download and install the musicexamples and lilyglyphs packages in
order to compile the docs."

> Likewise it's not clear to me why someone who contributes something to
> the toolbox should have to contribute documentation beyond the most
> basic description (which is why I suggested drawing a line between the
> basic toolbox documentation and more in-depth tutorials and manuals).

I will have to think about that, but I'll also comment on your other
email about it.

> It just seems to me that, looking at things, your sense of
> interdependency between the parts is a social rather than a technical
> requirement, and that tweaking those social requirements could still get
> you where you want to go, just making it much technically easier.
>
> LP contribution is in itself unnecessarily complicated because the
> requirements for doc and code contributions overlap in a way that they
> shouldn't have to, and I don't think you should make the same mistakes.
>
> For example, one of the benefits of DVCS is that it means that I can
> work on my little submodule by myself, do what I want to, and only worry
> about breakages when I'm submitting for merge -- when it gets run
> through the test suite.
>
> _That's_ how you keep things in sync, by automated testing prior to
> accepting a pull request, rather than by forcing everyone to have the
> whole massive archive on their machine and keep it in sync manually
> (which apart from a few hyper-virtuous individuals won't happen in any
> case).
>
> I think GitHub has some nice tools for integrating pull requests with
> automated testing, and this could be worth looking into.

OK, but I'll have to come back to this later, as it's somewhat too much
right now ;-)

Best
Urs
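[Editor's note: the repo-by-repo check Urs finds tedious (walking every working copy before shutting down) can at least be scripted. A hedged sketch; the repository names are stand-ins borrowed from the thread, created here as throwaway repos so the loop has something to report on.]

```shell
set -e
cd "$(mktemp -d)"
# Stand-in working copies (hypothetical names from the thread):
for d in openlilylib musicexamples lilyglyphs; do git init -q "$d"; done
echo draft > musicexamples/notes.txt     # leave one repo with uncommitted work

# One pass over all of them instead of remembering each repo by hand:
for d in */; do
  echo "== ${d%/}"
  git -C "$d" status --short --branch    # dirty files plus ahead/behind info
done
```

This doesn't remove the multi-repo overhead, but it turns "a fair amount of concentration" into one command; with remotes configured, a `git -C "$d" fetch` in the same loop would also surface unmerged upstream changes.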
From: Joseph R. W. <jos...@we...> - 2013-03-28 17:51:07
On 03/28/2013 05:59 PM, Urs Liska wrote:
> Why this overhead of having a remote repository and its submodule
> 'mirror'? And the caveats I read about in the progit book don't look
> reassuring either.

I'm not familiar with the remarks in the Pro Git book, but the logic of
submodules as I see it is that you can divide your codebase into
separately versioned parts which can then be incorporated into multiple
different projects. In the example I gave you, the LLVM-based D compiler,
it makes sense to have the standard library as a submodule because that
way contributors to different D compilers can contribute to and pull from
the latest version of that standard library regardless of which compiler
they're personally using.

> Not really problems, its rather the complexity of the amount of repos.
> An example: If I work on, say, the manual for 'musicexamples' I may run
> into updating the stylesheet package. If everything is in one repository
> I will surely notice the need for committing this change, at least if I
> want to power down the computer and see that git status isn't clean.

Wouldn't that kind of issue suggest that your repos are improperly
modularized?

For example, if an individual tutorial needs a stylesheet change, it
might make sense to allow tutorials to have custom local stylesheets that
are used alongside the global one. And if, when reviewing the merge of a
tutorial patch, there is a stylesheet change, it could be a conscious
decision to make that change to the global stylesheet instead.

Put it another way: if you have to make changes to the global stylesheet
often, then your global stylesheet needs rethinking anyway. Preventing
you from habitually making stylesheet changes along with tutorial changes
is probably a good way to enforce good design discipline.

> If the files were in separate repositories I would have to take care
> myself that I don't forget to update also the repo with the stylesheet
> in.
> Or: Each time I have to change a computer I have to go through all the
> repos to check if there are changes to be fetched and merged (or
> rebased). That's quite tedious and requires a fair amount of
> concentration in order not to omit a branch, for example.

It _should_ be as simple as git pull, git submodule update -- one more
command than you have to run anyway.

> Maybe you're right. But: As it is now I have to tell the contributor:
> "clone the repository and make sure that LaTeX finds this folder and
> LilyPond finds this folder. You may have stuff there that you don't
> need, but you get everything you need with one clone." If there were
> smaller chunks I'd have to say: "You may clone the module you want to
> work on, but you will have to download and install the musicexamples and
> lilyglyphs packages in order to compile the docs.

Well, you should still be able to do that with a master project and
submodules. But if you're talking about a manual or a tutorial, it's
clear that you have a strong dependency that requires everything. If
you're talking about just contributing to lilyglyphs, that _could_ be
largely independent of everything else, if you design things right.

For example, a lot of your sense of dependency between docs and
code/LaTeX packages seems to stem from your concept of the docs as a
large-scale manual. But if instead you have micro-documentation for the
individual toolboxes and packages (as you see with the D standard
library), you shouldn't need that strong interdependency. Then you can
still _have_ the large-scale manual, but as a separate project that
builds on top of the micro-docs.
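[Editor's note: Joseph's "git pull, git submodule update -- one more command" claim, demonstrated end-to-end with throwaway local repositories. All names (`oll`, `lib`) are hypothetical; the `protocol.file.allow=always` override is for newer git versions that restrict file-path submodule fetches.]

```shell
set -e
cd "$(mktemp -d)"; root=$PWD
ci() { git -C "$1" -c user.name=d -c user.email=d@example.org \
       commit -q --allow-empty -m "$2"; }            # tiny commit helper

# Hypothetical upstream: superproject 'oll' tracking submodule 'lib'.
git init -q lib && ci lib "lib v1"
git init -q oll
(cd oll &&
 git -c protocol.file.allow=always submodule add "$root/lib" lib &&
 ci . "add lib")

# Contributor's clone, fully initialized:
git clone -q "$root/oll" work
git -C work -c protocol.file.allow=always submodule update --init

# Upstream advances the submodule and records the new pointer:
ci lib "lib v2"
git -C oll/lib fetch -q origin
git -C oll/lib checkout -q FETCH_HEAD                # detached HEAD is fine here
(cd oll && git add lib && ci . "bump lib to v2")

# Staying current really is just one extra command:
cd work
git pull -q
git -c protocol.file.allow=always submodule update   # move lib to recorded commit
git -C lib log -1 --format=%s                        # prints: lib v2
```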
From: Urs L. <ul...@op...> - 2013-04-06 11:40:08
Am 28.03.2013 18:50, schrieb Joseph Rushton Wakeling:
> On 03/28/2013 05:59 PM, Urs Liska wrote:
> ...
>> Not really problems, its rather the complexity of the amount of repos.
>> An example: If I work on, say, the manual for 'musicexamples' I may run
>> into updating the stylesheet package. If everything is in one
>> repository I will surely notice the need for committing this change, at
>> least if I want to power down the computer and see that git status
>> isn't clean.
> Wouldn't that kind of issue suggest that your repos are improperly
> modularized?
>
> For example, if an individual tutorial needs a stylesheet change, it
> might make sense to allow tutorials to have custom local stylesheets
> that are used alongside the global one. And if, reviewing the merge of a
> tutorial patch, there is a stylesheet change, it could be consciously
> decided to make that change to the global stylesheet instead.

This is a very good idea. It is entirely practical to have an individual
stylesheet for use in an individual tutorial. Or a tutorial author could
write his additions directly into the preamble of his 'subfile'.

> Put it another way -- if you have to make changes to the global
> stylesheet often, then your global stylesheet needs rethinking anyway.
> Preventing you from habitually making stylesheet changes along with
> tutorial changes is probably a good way to enforce good discipline of
> design.

I see your point, but that's not really the case here. I have to change
the stylesheets often because they are being developed along the way: I
add, say, an environment for a specific kind of table when I first need
it. So the number of updates of this kind will decrease quickly. And it
can be made simpler with your previous suggestion.

Urs
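[Editor's note: the local-stylesheet idea could look something like the sketch below in LaTeX terms. The package name `ollstyle` and the `paramtable` environment are purely hypothetical illustrations, not existing openLilyLib files.]

```latex
% Hypothetical tutorial source: load the shared stylesheet, then add
% one tutorial-only environment in this file's own preamble.
\documentclass{article}
\usepackage{ollstyle}   % hypothetical global OLL stylesheet package

% Local, tutorial-only addition; the global sheet never sees it, so a
% global change cannot silently clash with it (and vice versa):
\newenvironment{paramtable}
  {\begin{center}\begin{tabular}{ll}}
  {\end{tabular}\end{center}}

\begin{document}
\begin{paramtable}
  staff size & 18pt \\
  indent     & 0pt  \\
\end{paramtable}
\end{document}
```

Once such an environment proves generally useful, it can be promoted from the tutorial's preamble into the global stylesheet, which matches the review workflow Joseph proposes above.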
From: Joseph R. W. <jos...@we...> - 2013-04-06 13:30:21
On 04/06/2013 01:39 PM, Urs Liska wrote:
> This is a very good idea. It is completely practical to have an
> individual style sheet for use in an individual tutorial.
> Or a tutorial author could write his additions directly in the preamble
> of his 'subfile'.

That said, you want to avoid this as much as possible, because it means
that changes to the global stylesheet may break individual tutorials if
there are clashes with their customizations.

I know that probably sounds like a contradiction of what I said before,
but not really. The priority should be something like: if you need a new
stylesheet feature, try to design something general-purpose and useful to
everyone that can go into the global stylesheet. If your need is really,
really specific, then put it in a local stylesheet, but try to design it
so that it's very unlikely to suffer from global stylesheet tweaks. What
shouldn't be acceptable is someone making a custom stylesheet just as a
lazy way of avoiding having to think about how to tweak the global
stylesheet.

Of course, if you really like someone's tutorial, you could accept it as
a short-term measure to get the tutorial included, and then the core
OLLib people take responsibility for correcting the design ... :-)

> I see your point, but that's not really the case here.
> I have to change the style sheets often because they are developed along
> the way. So I'll add an environment for a specific kind of table when I
> first need it. So the number of this kind of updates will fastly
> decrease. And it can be made simpler with your previous suggestion.

As you say, it's a short-term pain. So in the long term it's irrelevant
from the point of view of modularizing the documentation.