Re: [TuxKart-devel] configure dir
From: Steve B. <sjb...@ai...> - 2000-07-02 09:20:27
Bram Stolk wrote:
>
> Steve Baker wrote:
> >
> > Bram Stolk wrote:
> > >
> > > A remark:
> > > autoconf/automake suggests doing the configure stage in a directory
> > > other than the distrib dir.

Ah! Now I understand - you just want the binaries somewhere separate
from the sources...and the proposed way to do that is to run the
configure file from the distribution tree in an empty directory
hierarchy, so that the object code ends up there instead of all mixed
up in the source tree (something which annoys me no end with
autoconf/automake).

> But I think the author forgets the main reason:
>
> If I want to build for three architectures (intel,alpha,ppc), and for each
> architecture

Yep - I agree 100% - we have this problem all the time at work, where
we have to build the same code for three different IRIX variants, a
Solaris machine and a bunch of Linux boxes - some with Alphas and some
with Pentiums.

The problem here is that it requires the GNU version of make to do
this - and MANY systems don't have that installed as the default...or
at all, for that matter. There are also numerous 'pmake', 'smake' and
'NMAKE' variants, each with certain advantages...pmake and smake can do
cool parallel makes on machines like our big ONYX boxes that have 40+
CPUs. GNU make gives up and/or crashes when you do that.

What we did was to write a preprocessor for our Makefiles that takes a
special 'XMakefile', runs it through cpp and a number of other tools,
and spits out a normal 'Makefile' - which it then passes to whatever
underlying make there is. Then, in each source directory, we have
daughter directories named after each machine architecture, and the
makefile mangler gets the '.o' and '.a' files to disappear into those
daughter directories by stuffing '-o' directives onto compile lines
and various other subterfuges. For our own purposes this works quite
well, and with 'make' aliased to 'xmake', nobody is even very much
aware of the intrusion. Regenerating the Makefile before each 'make'
has the advantage that we can hide the annoying differences between
'make' implementations. However, that's clearly not a solution for
mass distribution.

> Heck, you could even have a Mesa-build and a nVidia OGL build coexist
> on your system, sharing the same sources. Now don't tell me that this
> does not appeal to you :-)

erm - sorry: "That does not appeal to me". :-)

The reason it doesn't is that now that we have the "OpenGL ABI for
Linux" stuff nailed down, I don't *need* separate binaries for nVidia
and Mesa. Here at home, I have one machine with an nVidia GeForce and
another with a Voodoo-3 2000 - and they share an NFS partition on my
home fileserver. Hence, the exact same binaries run on both machines -
and I can compile on either one of them and run on the other.

> It really must.

Nope!

> But as I've said, the numbers tell you it's not worth the effort. So
> please ignore my pedantic ramblings, and enjoy the success of yet
> another steve and oliver baker tux game :-)
>
> I just hope you agree with me now, that the illicit reasons do exist.

Actually, I do want to do things right - just so long as it doesn't
screw things up for the majority of users - and now that I understand
what you are talking about, I whole-heartedly agree as to the need.
However, I'm pretty much ignorant of the workings of autoconf/automake,
and the manuals for them are *appalling* to read.
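For the curious, here's a rough, hypothetical reconstruction of the
per-architecture daughter-directory trick our XMakefile preprocessor
ends up generating (the file and variable names are invented, and note
that '$(shell)' and pattern rules are exactly the GNU-make-isms that
make this non-portable; recipe lines must begin with a TAB):

    # Name the object directory after the machine architecture.
    ARCH   := $(shell uname -m)
    OBJDIR := obj-$(ARCH)
    SRCS   := tuxkart.c track.c
    OBJS   := $(SRCS:%.c=$(OBJDIR)/%.o)

    tuxkart-$(ARCH): $(OBJS)
    	$(CC) -o $@ $(OBJS) $(LDLIBS)

    # The stuffed '-o' directive steers each '.o' into the
    # daughter directory.
    $(OBJDIR)/%.o: %.c
    	@mkdir -p $(OBJDIR)
    	$(CC) $(CFLAGS) -c $< -o $@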
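And, for reference, a minimal sketch of the out-of-tree build being
proposed, assuming the sources are unpacked in ./tuxkart and that the
configure script supports it (the directory and cross-compiler names
here are purely illustrative):

    # One empty build directory per target; the object code lands
    # there instead of being mixed into the source tree.
    mkdir build-intel build-alpha
    ( cd build-intel && ../tuxkart/configure && make )
    ( cd build-alpha && CC=alpha-linux-gcc ../tuxkart/configure && make )

All the builds share the one source tree, and a 'make' in any build
directory touches only that directory.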
If you can guide me as to what I need to do, I'll be happy to do it -
just so long as Joe Q Public can still type ./configure;make;make install
and have it "just work".

--
Steve Baker
HomeEmail: <sjb...@ai...>
WorkEmail: <sj...@li...>
HomePage : http://web2.airmail.net/sjbaker1
Projects : http://plib.sourceforge.net
           http://tuxaqfh.sourceforge.net
           http://tuxkart.sourceforge.net
           http://prettypoly.sourceforge.net