#10 Support generated include files better

Status: closed
Owner: nobody
Labels: None
Priority: 5
Updated: 2012-12-29
Created: 2012-12-05
Creator: Thomas Moore
Private: No

From your incompatibilities list:

---
Makepp does not attempt to rebuild files included with the include statement unless the makefile contains a rule for building them before the include statement is seen. (It will attempt to rebuild the makefile itself, however.) This is normally used for handling include file dependencies, and is not as useful with makepp since you don't need to do that anyway.
---

Actually, I use this feature to create a simple makefile that only knows how to make its include files. gmake appears to:

1) read the main makefile completely, ignoring all -includes with no associated file
2) rebuild any include files for which it has rules, and which need rebuilding
3) restart make if any include files changed

makepp appears to:
1) read the main makefile
1a) rebuild any include files for which it already has rules when it encounters an include or -include, regardless of whether or not they need rebuilding

Yes, I'm probably one of the only people in the world that want this feature, and I have up until now been able to work around this behavior by disabling a few features in my build system, but it would be nice if I didn't have to work around that. Even with the workarounds, it's kind of annoying that include files get rebuilt regardless of whether or not they need it.
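
For reference, the classic gmake auto-dependency idiom relies on exactly that read/rebuild/restart behavior. A minimal sketch with made-up file names (not my actual setup, which only builds its include files, but the same mechanism):

---
SRCS = main.c util.c
OBJS = $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# gmake sees this rule, rebuilds any out-of-date .d files requested by the
# include below, and restarts itself if any of them changed.
%.d: %.c
	$(CC) -MM $(CPPFLAGS) $< > $@

-include $(SRCS:.c=.d)
---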

Another, related feature is the ability to check built include files before writing them out. With GNU make, I do a recursive make of /dev/null using standard input (the generated include file) as the makefile. I'm not sure if that's even possible with makepp. Perhaps that feature is already covered by calling some makepp internal function with perl. I already had to resort to this for auto-generating some dependencies.

I've attached the current version of my base makefile if you're interested. For details, see https://github.com/darktjm/literate-build/raw/master/build.pdf (or build.html in the same place). The rest of the makefiles are attached to either of these, with instructions on how to detach.

Discussion

  • Thomas Moore
    2012-12-05

    base makefile; extracted from build.nw

     
  • Dear Thomas!

    What gmake does is weird. I remember back when 1-pass compilers were hailed as a big advance. Gmake turns in circles till it gets everything right. I very much prefer putting things in the order in which they are needed – makes life much easier! And doesn't break anything for gmake either, if you want to stay compatible.

    1a is not true after "regardless". Given include file generators with or without dependencies:

    a.makepp:
    	&echo A=1 -o $(output)

    b.makepp: a.makepp
    	&echo B=2 -o $(output)

    include a
    include b

    &echo A=$A B=$B

    and calling "mpp; mpp" I get

    makepp: Loading makefile `/run/shm/Makeppfile'
    makepp: Entering directory `/run/shm'
    &echo A=1 -o a.makepp
    &echo B=2 -o b.makepp
    A=1 B=2
    makepp: 2 files updated and 0 phony targets built
    makepp: Loading makefile `/run/shm/Makeppfile'
    A=1 B=2
    makepp: no update necessary

    Frankly your makefile is a bit much! Can you please whittle it down to a minimum that shows your point, without depending on anything else I might not have.

    The option -f- for "makefile from stdin" has been possible since 2012-05-12 or 2.0.98.1. I don't know what you mean by a recursive build of /dev/null. That should quietly succeed, or fail loudly if you try it directly in /dev, where you don't have write permission for the logfile. But that is not necessarily the same as doing nothing, which even with gmake is only half possible. Certain side effects are there even with --dry-run.

    regards -- Daniel

     
  • Thomas Moore
    2012-12-06

    First, let me apologize for not using the latest snapshot download. I do not want my build system to depend on an unreleased version of makepp, so I just checked the sourceforge "latest download" button, and assumed that 2.0 was it.

    ---
    I very much prefer putting things in the order in which they are
    needed – makes life much easier! And doesn't break anything for gmake
    either, if you want to stay compatible.
    ---

    When I first added makepp support a few years ago, I reordered things as you wanted. I was able to build almost everything with makepp, with a few minor modifications. The reason I opened this feature request in the first place is that I have a current need to generate an include file after generating some C code. This generator generates both the C code and another file which contains the names of the C files it generated. I want to be able to add rules based on those names. However, the rules to generate the C code are complex and mixed in, so reordering seemed like a difficult task. For now, I decided to skip generating the makefile parts and instead just parse the file within the action of a single rule to build everything needed.
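
    To make the shape of the problem concrete, the simplified form of what I would like to write looks roughly like this (all names made up, and with the name list already in make syntax):

    ---
    -include gen-sources.mk       # defines gen_srcs, the list of generated .c files
    libgen.a: $(gen_srcs:%.c=%.o)
    	ar cr $@ $^
    gen-sources.mk: gen-code
    	./gen-code                # writes the .c files and gen-sources.mk
    ---

    gmake reads the whole file, rebuilds gen-sources.mk from the rule further down, and restarts; makepp sees the include before that rule, so it never tries to rebuild the file.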

    ---
    1a is not true after "regardless". Given include file generators with or
    without dependencies:
    ---

    OK, I guess you got me there. I was just reporting behavior I observed when I was regularly using makepp. I don't really use makepp regularly any more, but even as recently as a few weeks ago I observed that behavior. Now, I can't reproduce it for some reason. If I ever observe it again, I'll report it as a bug.

    ---
    Frankly your makefile is a bit much! Can you please whittle it down to a
    minimum that shows your point, without depending on anything else I might
    not have.
    ---

    That's part of the problem statement: computing the build order is not simple, and may depend on configuration options, including config options specified by files within the build order. I want to include the config files before the computation, but instead have to include them afterwards, and only allow the local (user-generated) config to override the build order computation.

    I guess your simple example is good enough for at least illustrating my second point. However, it is incompatible with GNU make, so let's make it compatible:

    ---
    default:
    a:
    	echo A=1>$@
    b: a
    	echo B=2>$@
    -include a
    -include b
    default:
    	@echo A=$A B=$B
    ---

    Illustrating my first point is just a matter of moving the -include statements up before the rules, although in this simple example there is no point in doing so. Note that they are -include statements, not include statements. Using include statements works with GNU make, but produces transient error messages and seems like incorrect behavior, anyway.
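
    For the record, the reordered form that illustrates the first point is the same set of rules with only the -include statements moved to the top (untested here, but gmake handles it the same way):

    ---
    -include a
    -include b
    default:
    	@echo A=$A B=$B
    a:
    	echo A=1>$@
    b: a
    	echo B=2>$@
    ---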

    ---
    The option -f- for "makefile from stdin" is possible since 2012-05-12 or
    2.0.98.1. I don't know what you mean by recursive build of /dev/null.
    That should quietly succeed, or loudly if you try directly in /dev, where
    you don't have write permission for the logfile. But that is not
    necessarily the same as doing nothing, which even with gmake is only half
    possible. Certain side effects are there even with --dry-run.
    ---

    What you say is true. What I want is to ensure that bad makefile components are never built. For the main makefile, this is accomplished by only sourcing it from one NoWeb file, and keeping it as small as possible. For the include files, this is accomplished by running a simple check. I can't give a specific target or no target (i.e. the default target), because no targets may be available. That's why I chose /dev/null.

    ---
    b: a
    	echo B=2 | env -i $(MAKE) -n -f - /dev/null
    	echo B=2 >$@
    ---

    The point of the first line is to fail before generating the output if the output is going to be invalid anyway. If I invoke this with the default, I get an error from recursive_makepp. If I invoke this with --traditional-recursive-make (or just replace $(MAKE) with makepp manually), I get "error: no targets specified and no default target". If I replace the echo with printf 'blah:\n\t@:\nB=2\n', I still get that error. If I also replace /dev/null with blah as the target, I still get that error. All three methods work fine with GNU make, including failing correctly if I introduce an error. That is with 2.0.98.2.

    There is one way I can force this to work, though (using --traditional-recursive-make):

    ---
    b: a
    	( \
    	trap "rm -f mf.$$$$" 0; \
    	echo B=2 > mf.$$$$; \
    	env -i $(MAKE) -n -f mf.$$$$ /dev/null \
    	)
    	echo B=2 >$@
    ---

    So, I guess by looking into this a bit more, I've answered my own question. It's a bit uglier than -f-, but it works with both makepp (with --traditional-recursive-make) and GNU make.

    However, if you don't intend to ever support reloading on -include change, there is no point in supporting the -f- thing, either. After all, the ordering restriction ensures that invalid include files at least never prevent valid include files from being built after correcting the problem. With GNU make, though, the make will fail if any existing include file has errors, and the only workaround is to delete the files with errors so that they will be regenerated. That's why I put that check there in the first place.

     
  • ---
    This generator generates both the C code and
    another file which contains the names of the C files it generated. I want
    to be able to add rules based on those names. However, the rules to
    generate the C code are complex and mixed in, so reordering seemed like a
    difficult task.
    ---

    If you have a circular dependency you might be in trouble anyway. Otherwise, you should be able to find some order. However, your problem sounds vaguely as though the rule option :include, or the command line option --defer-include might be of some help...
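
    For the usual .d-file case the rule option looks roughly like this (a sketch from memory; please check makepp_rules for the exact spelling and placement of the option):

    %.o: %.c :include %.d
    	$(CC) $(CFLAGS) -MD -c $(input) -o $(output)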

    ---
    OK, I guess you got me there. I was just reporting behavior I observed
    when I was regularly using makepp. I don't really use makepp regularly any
    more, but even as recent as a few weeks ago I observed that behavior. Now,
    I can't reproduce it for some reason. If I ever observe it again, I'll
    report it as a bug.
    ---

    I'm not aware of anything that changed in this respect. Building include files is no different from other rules (except they get run before the end of the makefile is reached – the same is true of any target you "prebuild").

    ---
    computing the build order is not
    simple, and may depend on configuration options, including config options
    specified by files within the build order. I want to include the config
    files before the computation, but instead have to include them afterwards,
    and only allow the local (user-generated) config to override the build
    order computation.
    ---

    Build systems are rule based. You don't force order, other than through dependencies. From there it's completely up to the tool, and may vary from one run to another, especially with parallel builds.

    ---
    Note that they are -include statements, not include
    statements. Using include statements works with GNU make, but produces
    transient error messages and seems like incorrect behavior, anyway.
    ---

    I don't see why gmake has a bug with include, but to my mind that is the correct statement, since given the rules it should be possible to obtain these files. OTOH -include is for files that are there or not due to factors out of make's control, e.g. settings for some optional component.

    ---
    What I want is to ensure that bad makefile
    components are never built. For the main makefile, this is accomplished by
    only sourcing it from one NoWeb file, and keeping it as small as possible.
    For the include files, this is accomplished by running a simple check. I
    can't give a specific target or no target (i.e. the default target),
    because no targets may be available. That's why I chose /dev/null.

    b: a
    	echo B=2 | env -i $(MAKE) -n -f - /dev/null
    	echo B=2 >$@
    ---

    $(MAKE) is magical in mpp, piping the args back to the main process. This will fail if that means having more than one Makefile in a directory, as this is the anchor for mpp. You correctly spotted --traditional-recursive-make to avoid that.

    ---
    The point of the first line is to fail before generating the output if the
    output is going to be invalid anyway. If I invoke this with the default, I
    get an error from recursive_makepp. If I invoke this with
    --traditional-recursive-make (or just replace $(MAKE) with makepp
    manually), I get "error: no targets specified and no default target". If I
    replace the echo with printf 'blah:\n\t@:\nB=2\n'
    ---

    This seems to be a bug with -f-. To be followed up...

    ---
    However, if you don't intend to ever support reloading on -include change,
    there is no point in supporting the -f- thing, either.
    ---

    It was never meant as a makefile syntax checker. Instead it is something that is used heavily in the gmake regression tests; -f- came out of an idea to apply that test suite to mpp as well.

    ---
    After all, the
    ordering restriction ensures that invalid include files at least never
    prevent valid include files from being built after correcting the problem.
    With GNU make, though, the make will fail if any existing include file has
    errors, and the only workaround is to delete the files with errors so that
    they will be regenerated. That's why I put that check there in the first
    place.
    ---

    Wouldn't it be easier to just make sure the files you generate are valid, rather than have everything crumble when they are not?

     
  • Thomas Moore
    2012-12-07

    ---
    Wouldn't it be easier to just make sure the files you generate are valid,
    rather than have everything crumble when they are not?
    ---

    Yes, it would. However, believe it or not, during development, I make mistakes. The traditional method for development is to repeatedly make changes to the source file and initiate a build. If the build has errors, the cycle is repeated but with an emphasis on correcting the errors. Perhaps you missed the point that these makefiles are not the source files, but products of the source files. Checking for errors removes a minor nuisance: having to delete the product files in order to cause them to be rebuilt. Instead of "make; edit; make; curse; rm makefile.rules; make" I just do "make; edit; make".

    ---
    If you have a circular dependency you might be in trouble anyway.
    Otherwise, you should be able to find some order. However, your problem
    sounds vaguely as though the rule option :include, or the command line
    option --defer-include might be of some help...
    ---

    The problem is not a circular dependency. The problem is that in order to put things in the order you prefer, I would have to make changes to unrelated source files, placing hooks that are there for the sole purpose of supporting this need. As I said, what I am doing now works fine, and has no circular dependencies (at least not that I introduced; makepp may add some automatically without notifying me).

    Also, allowing arbitrary file ordering makes it easier to meet one of the strict requirements of the entire system: code is ordered in the order of documentation, rather than the order the compiler (or build tool) wants it.

    ---
    Build systems are rule based. You don't force order, other than through
    dependencies. From there it's completely up to the tool, and may vary from
    one run to another, especially with parallel builds.
    ---

    In gmake, there is no issue whatsoever regarding ordering. Unless I screw up, parallel builds always work correctly, and execute in the order I specify (with dependencies, of course). The only things that vary from build to build are irrelevant orderings (as intended). If I actually had circular dependencies, GNU make would warn me about it. It's only in makepp that I have to put things in a particular order. That's why I'm asking for this to change.

    ---
    I don't see why gmake has a bug with include, but to my mind that is the
    correct statement, since given the rules it should be possible to obtain
    these files. OTOH -include is for files that are there or not due to
    factors out of make's control, e.g. settings for some optional component.
    ---

    I don't see it that way. GNU make correctly spots that the files do not exist, and flags that as an error. GNU make incorrectly (in my view) does not abort due to this error, and instead builds the files and restarts. I'm not going to argue about it, though.

    ---
    $(MAKE) is magical in mpp, piping the args back to the main process. This
    will fail if that means having more than one Makefile in a directory, as
    this is the anchor for mpp. You correctly spotted
    --traditional-recursive-make to avoid that.
    ---

    Incorrectly spotted, that is. Having to specify an obscenely long command-line option every time I do a make is counter to the purpose of the entire system: to make things easier. If there is a way to make this behavior the default via adding something to the makefile, I might consider it. And no, adding an alias to my .zshrc or writing a short wrapper script is not the correct answer here, either. Part of the purpose of this system is to allow others to use it as well, and I feel uncomfortable telling people to use disgusting command lines.

    ---
    It was never meant as a makefile syntax checker. Instead it is something
    that is used heavily in the gmake regression tests. It is part of a patch
    idea to apply it to mpp.
    ---

    What I meant was that if you don't intend to support the reason for my needing this feature, then *I* don't need it any more. If you still need it for whatever other reasons you have, I'm not going to say the feature is entirely useless.

    Currently, I am tempted to just remove all makepp support from my build system. It doesn't really buy me much (it only prevents C recompiles if I turn off #line directives, which in turn makes it hard to follow errors to their source line), and it screws up on several of my projects (e.g. it tries to build things it doesn't need to thanks to its "smart" command line processing, and sometimes executes actions without actually meeting the prerequisites, probably for the same reason). I simply don't have the energy to deal with makepp issues and also develop the code I actually care about at the same time. I am not seeking to replace gmake with something "better", but to provide an option which may bring some improvements with minimal effort. I will never drop gmake support in favor of makepp, but instead intend to support either one. Actually, I am seeking to replace it with something "better", but makepp is not even on the radar (Odin comes much closer to meeting my needs).

     
  • Hi Thomas,

    since what you are telling me doesn't make much sense to me, I went and studied noweb. Now I vaguely understand what we are talking about, but still not why it should cause so many problems. As I understand it, you're not actually generating any makefile stuff, you're only writing it into various files and extracting it. Mpp should be well capable of handling this situation, even if I never came across it before.

    Rather than debugging your complex setup, I'd like to start with a clear problem statement. It would be very helpful if you could explain your setup in easy words; then we should be able to find an easy solution. That is: an example of every artefact and how they're supposed to fit together. Especially since your macros are so complex, I don't get those NOWEB_* variables. Just tell me example content and what it means.

    I now understand how doc order is important to you, but not why that affects build order. If you have make variables that extend one another, order would matter, but I wouldn't do such a thing across many files. As for rules (except prebuild stuff like include files) the order in which mpp sees them is irrelevant.

    Also, what you are trying to do here seems to be what mpp's command parsers and scanners do, namely find the dependencies (and outputs). Actually it would be easy to add to mpp, and it would probably make all your worries disappear. I had such a task low on my radar for (La)TeX, but that's quite daunting with classes, images and various include statements.

    ---
    believe it or not, during development, I make
    mistakes. The traditional method for development is to repeatedly make
    changes to the source file and initiate a build. If the build has errors,
    the cycle is repeated but with an emphasis on correcting the errors.
    Perhaps you missed the point that these makefiles are not the source files,
    but products of the source files. Checking for errors removes a minor
    nuisance: having to delete the product files in order to cause them to be
    rebuilt. Instead of "make; edit; make; curse; rm makefile.rules; make" I
    just do "make; edit; make".
    ---

    You shouldn't ever need to manually delete a file produced by mpp. If it is wrong due to a wrong input, mpp will recreate it if the input changes. If the generating command is wrong, mpp will see that too. I guess you have buried some dependency too deeply for mpp to spot it.

    ---
    The problem is not circular dependency. The problem is that in order to
    put things in the order you prefer, I would have to make changes to
    unrelated source files, placing hooks that are there for the sole purpose
    of supporting this need. As I said, what I am doing now works fine, and
    has no circular dependencies (at least not that I introduced; makepp may
    add some automatically without notifying me).
    ---

    Mpp certainly doesn't add circular dependencies, only the ones that are obvious, i.e. source files, include files, libraries. If, just as an example, you make a .h depend on the .o that was built from it, then that would be a cycle.
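
    In makefile terms that contrived case would be something like this (made-up command name):

    foo.h: foo.o
    	extract-header foo.o > foo.h

    Since foo.o already depends on foo.h through the C include scanner (assuming foo.c includes it), that closes the loop.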

    ---
    In gmake, there is no issue whatsoever regarding ordering. Unless I screw
    up, parallel builds always work correctly, and execute in the order I
    specify (with dependecies, of course). The only things that vary from
    build to build are irrelevant orderings (as intended). If I actually had
    circular dependencies, GNU make would warn me about it. It's only in
    makepp that I have to put things in a particular order. That's why I'm
    asking for this to change.
    ---

    Build order and makefile order are not related. We already talked about why gmake handles what you do the way it does.

    ---
    Having to specify an obscenely long
    command-line option every time I do a make is counter to the purpose of the
    entire system: to make things easier. If there is a way to make this
    behavior the default via adding something to the makefile, I might consider
    it.
    ---

    It's meant to stick out like a sore thumb, because it breaks the reliability which is the big plus over gmake. You can shorten it to --traditional, and you can add it to .makepprc in the root of your build. But we should manage without it as I said!

    ---
    It doesn't really buy me much (it only prevents C recompiles if I
    turn off #line directives, which in turn makes it hard to follow errors to
    their source line),
    ---

    Not handling #line would break things if your source uses __LINE__. But I think you are right: we should make it optional.

    ---
    and it screws up on several of my projects (e.g. it
    tries to build things it doesn't need to thanks to its "smart" command line
    processing,
    ---

    Do you only feel so because it is "not important", or is it actually doing something wrong?

    ---
    and sometimes executes actions without actually meeting the
    prerequisites, probably for the same reason).
    ---

    That would be a bug if it's aware of the prereq. If it's too deeply buried, you must specify it on the rule.

     
  • Thomas Moore
    2012-12-09

    I am trying to ask for a feature. I'm pretty sure you understand exactly what I want; you just want me to justify it. I don't think any amount of argument will convince you that this is even the least bit necessary, so I suggest you close this issue without further comment. I apologize for wasting your time with this.

    I wasn't going to do this, but I'll go ahead and respond to some of your comments:

    First, although I didn't think it would be necessary to know this since my request was so simple, here's a basic idea of how my system is supposed to work:

    You can think of NoWeb source files as archives. The "notangle" command simply extracts the archive member ("code chunk") with a given name (or a default if no name is given). If more than one source file is given, each source file's version of that code chunk is extracted and concatenated together for the result. For example, if f1.nw, f2.nw, and f3.nw all have a code chunk named "X", "notangle -RX f1.nw f2.nw f3.nw > X" will result in X from f1.nw, followed by X from f2.nw, followed by X from f3.nw.

    Normally, I combine the code from all .nw files in the same directory. However, for some things, ordering is important. Even in a makefile, the ordering of a particular variable's value may be important. The order given by just using *.nw is alphabetical and dependent on locale. So instead, I defined specially formatted comments in .nw files to give an explicit order. For example, it is possible to say that f3.nw should always precede f2.nw: "notangle -RX f1.nw f3.nw f2.nw".

    Now, if I wanted to make just one makefile using this ordering, I could do so, using "notangle -Rmakefile f1.nw f3.nw f2.nw". However, I don't want to have to remember the correct order, or type such a long line every time I need to change the makefile. Instead, I split the makefile into one which computes the correct order based on those special comments and direct \input statements, and then creates a more complete makefile by extracting include files. This is not intended to ever be expanded by other .nw files, so it can be extracted with the same command for every build directory: "notangle -t8 build.nw > makefile" (-t8 just forces 8-char tabs, required for GNU make, and no -R option is needed because the makefile is the default).

    I could have made just one include file, but instead I split the variable definitions from the rules so that they could use :=, which means they need to appear before their use. I also split out a few variables as "configuration variables", because I only support editing the config file rather than autoconf/cmake style configuration (which has been on my TODO list for 8 years, so it's not changing any time soon). I also split out the library rules, mainly because generating them requires actual code execution, rather than just notangle.

    Now, in the above explanation is the first reason I wanted properly supported include files: the config variables are supposed to be able to affect the build order as well. I had to reorganize the makefile to accommodate makepp, so the config variables can no longer affect build order. This is actually no big loss, because although I wanted this to work, I have never actually *needed* it to work.

    You might wonder why I would need to create a makefile from multiple noweb inputs, but build.nw is really very basic. It only includes generic rules for C code and scripts. Even that assigns variables to code chunks that need to be expanded in further noweb files, to add more scripts to a list or more C executables or more libraries or whatever. More complex specific and generic rules can be added by other noweb files. For example, one of my extensions adds rules for automatically running gperf on a selected set of names, generating an enumeration header file and a lookup function. That particular set of rules triggered a bug in makepp, which I have just reported. The project I'm currently working on parses a set of text files and produces various static C tables, one file per table (so static linking only pulls in the needed tables). This is what prompted me to open this request; I wanted to be able to insert the object files into an already-defined library, but I couldn't think of any way that did not require reordering other, mostly unrelated things just to satisfy makepp. Instead, I just made a single rule that generates a different library out of just those object files, and don't bother building them via rules either, but instead use just one really long $(CC) -c command which is bound to fail on non-Linux systems. Basically:

    generated: dat-generator
    	./dat-generator
    gen_files=$(shell cat generated)
    libdat.a: generated
    	$(CC) -c $(CFLAGS) $(gen_files)
    	ar cr $@ $(gen_files:%.c=%.o)
    	ranlib $@

    Since makepp supports multiple outputs, it would probably also be nice to add the list of generated files to the list of outputs from the first rule, but that's a minor side issue.
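
    If the generated names were known up front (they are not, which is the whole point), that would just mean listing them as additional targets of the one rule, e.g. with made-up names:

    ---
    generated tab1.c tab2.c: dat-generator
    	./dat-generator
    ---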

    As to the rules which do latex: while the bulk of build.nw is about doing the latex stuff (actually, syntax highlighting for latex and html), it's all relatively stable and has no special make-related needs, at least not the way I do it. Odin does it by building the document in a separate directory, automatically detecting/running makeindex, bibtex, and detecting changes in all aux files for reruns, but that's more complicated than it needs to be, and runs latex one more time than necessary.

    As to your comments regarding my comments on make -f-, and why I use it: let me repeat: I do not use it for makepp, and do not need it, either, because makepp will never fail due to a bad, existing include file that it will rebuild anyway. However, if this feature request were implemented the way gmake does, it might.

    Your statement that build order and makefile order are not related is not entirely true, nor relevant to what I want to do. What I want to do is change a variable or two for a rule which has already been defined. This is not possible if that rule is used to generate an include file, and the variable redefinition happens after the include directive. However, for any other rule, this works as it should.

    Ignoring #line directives might be a useful option, but your argument as to why they should not be ignored is completely valid. After all, if you're debugging, you probably want your #line directives to be correct. I wasn't complaining that #line directives should be ignored, but that I should've realized before I started that makepp wouldn't buy me much.

    Your statement that makepp does not create circular dependencies is wrong. Not only does it create circular dependencies, it does so without warning. The command parser is actually doing something wrong. I have opened a bug report (3594253). However, even if it weren't doing something wrong, it's possible that your include directive parser could accidentally pick up includes it shouldn't (for example, when using an unsupported C compiler, or using it in an unexpected manner), and your command parser may pick up commands that are meant to be taken from elsewhere rather than be built. Yes, I am aware that --no-path-executable-dependencies overrides the latter. You will probably never be able to fix every single possible bug, especially if circular auto-deps are added silently (well, you can find them if you print the dep tree and examine it carefully, but that can be a lot of work, and even then circular deps are not as obvious as they should be).

     
  • New signature that ignores lines

     
  • Thanks for the long response, I'll have to think more about it!

    In the meanwhile, I had long felt the line stuff in signature C to be a nuisance, because few people actually need __LINE__. But now your tangle stuff opened my eyes that there exists a much stronger reason to ignore lines if you can. Wasn't too hard to do, it's attached, and can be activated with:

    override signature C_flow

     
  • I'm still thinking about the C_flow signature. I've got the algorithm fine-tuned now, but I don't see specifying it the sledgehammer way with override as the solution. It's more of a general "option to signature C" thing, but we have no syntax to specify such options, and doing it with perl { $Mpp::Signature... = 1 } seems wrong...

    As for your main wish, I've been racking my brain over how to keep track of modifications, so as to do this reliably with backtracking. I should have checked gmake first -- as usual they do it unreliably too, i.e. if you assign variables with := and a side effect, you get the side effect as many times as they need to reiterate the makefile :-( They should repeat only those side effects that depend on something changed due to the new include, but this kind of logic only exists for rules and files.

    It should be feasible to do the same for mpp, based on the makefile rebuilding/reloading code combined with the code for --defer-include. Alas, mpp would have stronger reasons to do it right, because &builtin commands at the makefile top level are one feature and global variables are another. I can't just go back to the value before loading this makefile, because dependencies may have loaded other makefiles, which may also have appended to the global var. So if appending is used, it would happen multiple times.

    Anyway, for this gmake compatibility, hope is on the way!

     