ctool-develop Mailing List for cTool Library
Brought to you by:
flisakow
2003: Jan | Feb (1) | Mar (1) | Apr | May | Jun | Jul (10) | Aug (23) | Sep (12) | Oct | Nov | Dec (2)
2004: Jan | Feb | Mar | Apr | May (2) | Jun | Jul | Aug | Sep | Oct | Nov | Dec
From: <ben...@id...> - 2004-05-22 12:04:54
Dear Open Source developer,

I am doing a research project on "Fun and Software Development" in which I kindly invite you to participate. You will find the online survey at http://fasd.ethz.ch/qsf/. The questionnaire consists of 53 questions and you will need about 15 minutes to complete it.

With the FASD project (Fun and Software Development) we want to determine the motivational significance of fun when software developers decide to engage in Open Source projects. What is special about our research project is that a similar survey is planned with software developers in commercial firms. This procedure allows an immediate comparison between the involved individuals and the conditions of production of these two development models. Thus we hope to obtain substantial new insights into the phenomenon of Open Source development.

With many thanks for your participation,
Benno Luthiger

PS: The results of the survey will be published at http://www.isu.unizh.ch/fuehrung/blprojects/FASD/. We have set up the mailing list fa...@we... for this study. Please see http://fasd.ethz.ch/qsf/mailinglist_en.html for registration to this mailing list.

_______________________________________________________________________
Benno Luthiger
Swiss Federal Institute of Technology Zurich
8092 Zurich
Mail: benno.luthiger(at)id.ethz.ch
_______________________________________________________________________
From: Stefan S. <se...@sy...> - 2004-05-07 13:25:00
hi there,

this is just a little note to let you know that I've started to integrate ctool into the synopsis framework (http://synopsis.fresco.org). The first step is done: ctool is now built as a python module, and the regression tests are integrated into synopsis' own testing framework. This means I can run all the original regression tests and monitor changes while doing further work.

My own short term goal is to make the ctool back-end generate a synopsis AST to build reference documentation (see http://synopsis.fresco.org/docs/Manual/index.html for examples). But in the long term I'd also like to expose more of the power ctool provides, such as code generation and call graph inspection.

If anybody on this list is still interested in ctool and its evolution, now is the time to speak up!

Take care,
Stefan
From: Shaun F. <fli...@so...> - 2003-12-28 02:54:45
Kragen,

Thanks, but it doesn't appear to me that this program violates the cTool license:

- it doesn't appear to be proprietary; it's available as source code from SourceForge.
- The license file clearly states brcc is derived from cTool, and supplies full source for it, and its original license file.

Also, I've recently been convinced to convert the license to LGPL for the benefit of Synopsis: http://synopsis.sourceforge.net/ I hadn't done any of the paperwork yet, because I'm a slacker - but now I've changed the cTool license on SourceForge to accurately reflect the current license information.

The license for the original cTool package (which didn't use the GPL) had an exemption for research / universities anyhow, since I personally know how difficult it can be to get a university to waive what it sees as its rights.

Thanks,
Shaun

On Dec 22, 2003, at 7:14 PM, Kragen Sitaker wrote:

> It looks like the good folks at brook.sf.net aka
> http://graphics.stanford.edu/projects/brookgpu are distributing
> proprietary software based on cTool. You might want to see what you
> can do about that.
From: Stefan S. <se...@sy...> - 2003-12-05 15:53:25
hi there,

I just agreed with Shaun to integrate the ctool code into synopsis (http://synopsis.fresco.org) as a C backend. Shaun agreed to relicense the code under LGPL, so it can be used in a framework integrated with GPL-incompatible parts (the rest of synopsis is already licensed under LGPL).

As the ctool project itself doesn't seem to evolve any further at this point, I'd like to invite everybody who is interested in further development to join the synopsis project. Synopsis is a general code introspection tool, currently with a focus on code documentation. Ctool will give it a new language binding, and stimulate a broader set of AST types so arbitrary code can be introspected ('statements'), not just 'declarations'.

You can find a copy of the ctool repository at http://synopsis.fresco.org/viewcvs/Synopsis/Synopsis/Parsers/C/ctool/ where I'm going to restructure it a bit to fit better into the whole. I'm open to comments, criticism, and suggestions as to how to proceed from here.

Take care,
Stefan
From: Shaun F. <fli...@so...> - 2003-09-09 06:53:37
Hmm, I didn't comment it out, it must have been Daniel. Searching... - it looks like he commented it out in lexer.l v1.11.

This was rather buggy, as it depended on the format of the #line directives to undo the include, and that format seems to depend on the C preprocessor used.

Dan, any comments?

Thanks,
Shaun

On Monday, September 8, 2003, at 01:03 PM, Stefan Seefeld wrote:

> hi there,
>
> trying to kill the last three failed tests in accept0, I find
> that the cause of the failure is that these three tests contain
> include directives, which are not dealt with by ctool. While
> the library contains some code to detect whether or not the
> current statement is part of the main file or not, the code
> to actually instantiate an 'InclStemnt' is commented out (#if 0...).
>
> What's the reasoning behind that? Is that a work in progress?
> What can I do to get it working (again)?
>
> Thanks,
> Stefan
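The format dependence Shaun describes is easy to see: GNU cpp emits linemarkers of the form `# 42 "file.h" 1`, while other preprocessors emit `#line 42 "file.h"`. A minimal sketch of a parser accepting both forms (the function name is hypothetical, not ctool's actual code):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hypothetical helper: parse either a GNU-cpp linemarker such as
//   # 42 "file.h" 1
// or the classic directive
//   #line 42 "file.h"
// Returns true on success, filling in the line number and file name.
// Relying on only one of these spellings is exactly the fragility
// discussed above.
static bool parseLinemarker(const std::string& text, int& line, std::string& file)
{
    char buf[256];
    if (std::sscanf(text.c_str(), "# %d \"%255[^\"]\"", &line, buf) == 2 ||
        std::sscanf(text.c_str(), "#line %d \"%255[^\"]\"", &line, buf) == 2)
    {
        file = buf;
        return true;
    }
    return false;
}
```

The extra flags GNU cpp appends after the file name (enter/leave markers) are simply ignored by this sketch.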
From: Stefan S. <se...@sy...> - 2003-09-08 00:30:53
Shaun Flisakowski wrote:

> On Friday, September 5, 2003, at 09:22 AM, Stefan Seefeld wrote:
>
>> Now 89.66% of the accept0 tests pass (yay!). The rest seems to stem
>> from a parse error in /usr/include/stdio.h, specifically at a line like:
>>
>> extern char *tempnam (__const char *__dir, __const char *__pfx)
>> __attribute__ ((__malloc__));
>>
>> While the scanner supports '__attribute__', the grammar doesn't look
>> as if it allows it at that particular place. As I don't quite
>> understand the grammar (yet), I don't know what change to make
>> to gram.y to add an optional attribute to a function declaration
>> like the above. It might be a one-line fix...
>
> It looks to me like __attribute__ is already accepted at the proper spot
> (after any declarator), in gram.y:
> decl: declarator [opt_gcc_attrib]
>
> I would guess the problem might stem from lexer.l not having an entry
> for __malloc__; you would need a line like the other supported flags
> have, see __format__, __noreturn__ etc. while in the <GCC_ATTRIB> state.

good guess! I added the necessary code to support the '__malloc__' attribute and the error is gone. Thanks a lot!

I was confused because the error message reported a parse error *before* '__attribute__', but that's probably just because of the way gram.y is written.

Anyways, I'm getting closer to 100% passing tests :-)

Best regards,
Stefan
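For reference, the construct under discussion is a GCC attribute placed after the declarator, which GCC and Clang both accept. A hedged example, assuming a GCC-compatible compiler; `dupstring` is an illustrative function, not part of ctool or glibc:

```cpp
#include <cassert>
#include <cstdlib>
#include <cstring>

// The same shape as the stdio.h declaration that tripped the parser:
// an __attribute__ after the declarator.  __malloc__ tells the compiler
// the returned pointer does not alias any other live pointer.
char* dupstring(const char* s) __attribute__((__malloc__));

char* dupstring(const char* s)
{
    char* p = static_cast<char*>(std::malloc(std::strlen(s) + 1));
    if (p)
        std::strcpy(p, s);
    return p;
}
```

The attribute only needs to appear on the declaration; the definition can omit it, just as system headers do.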
From: Shaun F. <fli...@so...> - 2003-09-07 20:59:47
On Friday, September 5, 2003, at 09:22 AM, Stefan Seefeld wrote:

> Now 89.66% of the accept0 tests pass (yay!). The rest seems to stem
> from a parse error in /usr/include/stdio.h, specifically at a line like:
>
> extern char *tempnam (__const char *__dir, __const char *__pfx)
> __attribute__ ((__malloc__));
>
> While the scanner supports '__attribute__', the grammar doesn't look
> as if it allows it at that particular place. As I don't quite
> understand the grammar (yet), I don't know what change to make
> to gram.y to add an optional attribute to a function declaration
> like the above. It might be a one-line fix...

It looks to me like __attribute__ is already accepted at the proper spot (after any declarator), in gram.y:

decl: declarator [opt_gcc_attrib]

I would guess the problem might stem from lexer.l not having an entry for __malloc__; you would need a line like the other supported flags have, see __format__, __noreturn__ etc. while in the <GCC_ATTRIB> state.

Thanks,
Shaun
From: Stefan S. <se...@sy...> - 2003-09-05 16:22:42
Shaun Flisakowski wrote:

> There is code inside lexer.l that turns "__builtin_va_list" into an
> ellipsis. There is code commented out there to turn it into an "int"
> instead. If it was made a builtin type like INT (defined near the top
> of gram.y) and had support inside printing, etc., it would work just like
> it always existed.

ok, I've enabled the code that turns it into an 'INT' for now. I may add a new type later if it is needed.

> __restrict doesn't appear to me to be a type; it looks like some sort of
> storage class modifier like extern or auto.

borrowing an idea from another (C++) parser I work with, I simply ignore '__restrict' tokens.

Now 89.66% of the accept0 tests pass (yay!). The rest seems to stem from a parse error in /usr/include/stdio.h, specifically at a line like:

extern char *tempnam (__const char *__dir, __const char *__pfx)
__attribute__ ((__malloc__));

While the scanner supports '__attribute__', the grammar doesn't look as if it allows it at that particular place. As I don't quite understand the grammar (yet), I don't know what change to make to gram.y to add an optional attribute to a function declaration like the above. It might be a one-line fix...

Regards,
Stefan
From: Shaun F. <fli...@so...> - 2003-09-05 10:10:49
On Thursday, September 4, 2003, at 10:43 AM, Stefan Seefeld wrote:

>> I don't think we need to bother with a table of types, there are many
>> gcc specifics which are not types, and still require handling, see
>> the support for __attribute in lexer.l. I think gcc is popular
>> enough these extensions may as well just be true built-ins in cTool.
>
> Fine. Does anybody have a reference to a document describing the grammar
> for these extensions?

Sure, they are on the GNU site:

http://gcc.gnu.org/onlinedocs/gcc-3.3.1/gcc/index.html#toc_C%20Extensions

Shaun
From: Stefan S. <se...@sy...> - 2003-09-04 17:44:21
Shaun Flisakowski wrote:

> There is code inside lexer.l that turns "__builtin_va_list" into an
> ellipsis. There is code commented out there to turn it into an "int"
> instead. If it was made a builtin type like INT (defined near the top
> of gram.y) and had support inside printing, etc., it would work just like
> it always existed.

Hmm, ok. It has to be a type, or else the typedef I cited yesterday wouldn't be valid. I don't have enough experience with this stuff to be able to make an educated suggestion as to how to represent that type. I fear that if we represent just that one type (i.e. '__builtin_va_list') it's too specific a solution; other builtin types may follow. On the other hand, we could have a type token that uses a table lookup to find the real name of the type. But that doesn't work if the different types covered that way have different semantics w.r.t. the AST (and thus, gram.y).

> __restrict doesn't appear to me to be a type; it looks like some sort of
> storage class modifier like extern or auto.

yep, that's the impression I get as well.

> I don't think we need to bother with a table of types, there are many
> gcc specifics which are not types, and still require handling, see the
> support for __attribute in lexer.l. I think gcc is popular enough these
> extensions may as well just be true built-ins in cTool.

Fine. Does anybody have a reference to a document describing the grammar for these extensions?

Regards,
Stefan
From: Shaun F. <fli...@so...> - 2003-09-04 16:27:11
If this is legal C99 we should just support it directly. Please remove the warning.

Thanks,
Shaun

On Wednesday, September 3, 2003, at 04:26 PM, Stefan Seefeld wrote:

> Shaun Flisakowski wrote:
>
>> Empty arrays are not allowed in ANSI C in general; it does seem like
>> that should probably be a warning rather than an error though, since
>> K&R C was lax about that.
>
> I found this in <cdefs.h> on my machine:
>
> /* Support for flexible arrays. */
> #if __GNUC_PREREQ (2,97)
> /* GCC 2.97 supports C99 flexible array members. */
> # define __flexarr []
> #else
> # ifdef __GNUC__
> # define __flexarr [0]
> # else
> # if defined __STDC_VERSION__ && __STDC_VERSION__ >= 199901L
> # define __flexarr []
> # else
> /* Some other non-C99 compiler. Approximate with [1]. */
> # define __flexarr [1]
> # endif
> # endif
> #endif
>
> and the line the ctdemo is stumbling over is:
>
> __extension__ struct __gconv_step_data __data __flexarr;
>
> I think the 'right thing to do' would be to provide some
> form of a config file that sets individual parser behaviors,
> such as whether or not to accept 'flexible arrays'.
>
> A quick fix would be to just take the error condition
> out of gram.y. This flexible array thing seems to occur
> in almost every file I parse as it's part of some very
> fundamental system headers on my machine.
>
> Regards,
> Stefan
>
> -------------------------------------------------------
> This sf.net email is sponsored by: ThinkGeek
> Welcome to geek heaven.
> http://thinkgeek.com/sf
> _______________________________________________
> Ctool-develop mailing list
> Cto...@li...
> https://lists.sourceforge.net/lists/listinfo/ctool-develop
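The `__flexarr` macro quoted in this thread falls back to `[1]` on non-C99 compilers, and that approximation (the classic "struct hack") can be sketched in a self-contained way. `Entry` and `makeEntry` are illustrative names only; note that C99 would spell the member `char name[];` (the flexible array member ctool's grammar rejected), and that over-reading a `[1]` array is formally undefined behavior even though the idiom long predates C99:

```cpp
#include <cassert>
#include <cstdlib>
#include <cstring>

// The pre-C99 approximation from the __flexarr macro: a trailing array
// of size 1, over-allocated so 'name' can hold a longer string.
struct Entry
{
    int  length;
    char name[1];   // __flexarr fallback: "Approximate with [1]."
};

// Allocate an Entry with enough trailing room for the whole string:
// sizeof(Entry) already provides 1 byte of 'name', so adding strlen(s)
// leaves space for the terminating NUL as well.
static Entry* makeEntry(const char* s)
{
    std::size_t len = std::strlen(s);
    Entry* e = static_cast<Entry*>(std::malloc(sizeof(Entry) + len));
    e->length = static_cast<int>(len);
    std::strcpy(e->name, s);
    return e;
}
```

With a C99 flexible array member the allocation would be `sizeof(Entry) + len + 1` instead, which is why glibc selects the spelling per compiler.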
From: Shaun F. <fli...@so...> - 2003-09-04 16:23:42
Stefan,

There is code inside lexer.l that turns "__builtin_va_list" into an ellipsis. There is code commented out there to turn it into an "int" instead. If it was made a builtin type like INT (defined near the top of gram.y) and had support inside printing, etc., it would work just like it always existed.

__restrict doesn't appear to me to be a type; it looks like some sort of storage class modifier like extern or auto.

I don't think we need to bother with a table of types; there are many gcc specifics which are not types and still require handling, see the support for __attribute in lexer.l. I think gcc is popular enough that these extensions may as well just be true built-ins in cTool.

Thanks,
Shaun

On Wednesday, September 3, 2003, at 05:23 PM, Stefan Seefeld wrote:

> hi there,
>
> I'm running into a series of other problems, which seem
> at least partly to be gcc-specific:
>
> I see lines of the form:
>
> typedef __builtin_va_list __gnuc_va_list;
>
> without any include directive including a file defining
> '__builtin_va_list'. (Of course not! It's 'builtin'.)
>
> How can this be supported? Should the ctool code set
> up some internal table of predefined 'builtin' types /
> identifiers?
>
> Further, I'm seeing lots of declarations of the form:
>
> extern int _IO_vfscanf (_IO_FILE * __restrict, const char * __restrict,
> _IO_va_list, int *__restrict) __THROW;
>
> i.e. with the keyword '__restrict'. This keyword isn't
> recognized by the ctool parser either, so it reports
> an error about 'duplicate parameter names'.
> (In contrast to the first problem, the '__restrict'
> keyword doesn't seem to be gcc-specific.)
>
> Any suggestion how to fix this?
>
> Thanks,
> Stefan
From: Stefan S. <se...@sy...> - 2003-09-04 00:25:31
hi there,

I'm running into a series of other problems, which seem at least partly to be gcc-specific. I see lines of the form:

typedef __builtin_va_list __gnuc_va_list;

without any include directive including a file defining '__builtin_va_list'. (Of course not! It's 'builtin'.) How can this be supported? Should the ctool code set up some internal table of predefined 'builtin' types / identifiers?

Further, I'm seeing lots of declarations of the form:

extern int _IO_vfscanf (_IO_FILE * __restrict, const char * __restrict, _IO_va_list, int *__restrict) __THROW;

i.e. with the keyword '__restrict'. This keyword isn't recognized by the ctool parser either, so it reports an error about 'duplicate parameter names'. (In contrast to the first problem, the '__restrict' keyword doesn't seem to be gcc-specific.)

Any suggestion how to fix this?

Thanks,
Stefan
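Both constructs can be seen in a short example. `__restrict` is the GCC spelling of C99's `restrict`, an aliasing promise that only enables optimizations (which is why simply ignoring the token, as was later done, is a safe parser strategy), and `__builtin_va_list` is the compiler-internal type behind `va_list`. A sketch assuming a GCC-compatible compiler; the function names are illustrative:

```cpp
#include <cassert>
#include <cstdarg>

// '__restrict' promises the two pointers don't alias; removing the
// keyword changes nothing about what the program computes.
static void fill(int* __restrict dst, const int* __restrict src, int n)
{
    for (int i = 0; i < n; ++i)
        dst[i] = src[i];
}

// 'va_list' is typedef'd to __builtin_va_list under gcc; a variadic
// function is where that builtin type usually surfaces in headers.
static int sum(int count, ...)
{
    va_list ap;
    va_start(ap, count);
    int total = 0;
    for (int i = 0; i < count; ++i)
        total += va_arg(ap, int);
    va_end(ap);
    return total;
}
```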
From: Stefan S. <se...@sy...> - 2003-09-03 23:28:54
Shaun Flisakowski wrote:

> Empty arrays are not allowed in ANSI C in general; it does seem like
> that should probably be a warning rather than an error though, since K&R
> C was lax about that.

I found this in <cdefs.h> on my machine:

/* Support for flexible arrays. */
#if __GNUC_PREREQ (2,97)
/* GCC 2.97 supports C99 flexible array members. */
# define __flexarr []
#else
# ifdef __GNUC__
# define __flexarr [0]
# else
# if defined __STDC_VERSION__ && __STDC_VERSION__ >= 199901L
# define __flexarr []
# else
/* Some other non-C99 compiler. Approximate with [1]. */
# define __flexarr [1]
# endif
# endif
#endif

and the line the ctdemo is stumbling over is:

__extension__ struct __gconv_step_data __data __flexarr;

I think the 'right thing to do' would be to provide some form of a config file that sets individual parser behaviors, such as whether or not to accept 'flexible arrays'. A quick fix would be to just take the error condition out of gram.y. This flexible array thing seems to occur in almost every file I parse as it's part of some very fundamental system headers on my machine.

Regards,
Stefan
From: Shaun F. <fli...@so...> - 2003-09-03 16:53:20
Hi Stefan,

I recently turned off the labels and extra debug output for the regression tests - I didn't bother fixing the gcc suite as it is not yet working.

Empty arrays are not allowed in ANSI C in general; it does seem like that should probably be a warning rather than an error though, since K&R C was lax about that.

The STDC thing is injected into the preprocessor on the line that calls cpp from project.cpp; some systems don't include it as yours apparently does, but it needs to be on to get many include files to function.

So, to sum up, don't worry about the gcc tests, they didn't run well before. :^)

Thanks,
Shaun

On Wednesday, September 3, 2003, at 05:45 AM, Stefan Seefeld wrote:

> hi there,
>
> I'm looking into the regression testing as I want
> to get that up and running again as a precondition
> for any non-trivial code changes. Specifically, I'm
> looking into the gcc test suite.
>
> Trying to compile the 'attrib' test, I run into
> various problems:
>
> * the file containing the expected output (run/attrib.out)
> contains a lot more than the output generated by
> the actual test run ('../../src/ctdemo run/attrib.c'),
> i.e. symbol, tag, and label listings. It seems the
> command used by the run_suite script isn't quite
> complete. Adding '-d' will add most of the missing
> stuff, though the listed names are mangled somehow;
> they appear to contain the location in the source file
> or something like that...
>
> * the run/attrib.c file compiled with my gcc (3.2.2)
> generates a warning:
>
> run/attrib.c:14: Warning - old-style declaration or incorrect type:
> __attribute__ ((__format__ (printf, 2, 3)));
>
> Is that a recent addition to gcc? Should that be fixed
> in the ctool? Or is there something else that can be
> done to make this warning go away?
>
> The same warning is issued at various places with other
> source files from that test suite as well.
>
> * /usr/include/gconv.h generates an error message:
>
> /usr/include/gconv.h:176: Unsized array not allowed as field:
> __extension__ struct __gconv_step_data __data [];
>
> Unfortunately I'm not a C expert, so I can't really comment
> on this, other than that I see the line in gram.y that issues this
> message. I've no idea whether that's a gcc extension, or generally
> how to deal with that.
>
> * once again the question concerning 'warning: "__STDC__" redefined':
> what's causing this? Does the ctdemo define and inject that macro
> into the preprocessor? It seems to be defined by my system (either
> compiler or system headers or a combination thereof).
>
> I would hugely appreciate some help on getting this stuff up and running;
> it would make it much easier to validate future code changes...
>
> Thanks a lot!
>
> Stefan
From: Stefan S. <se...@sy...> - 2003-09-03 12:47:44
hi there,

I'm looking into the regression testing as I want to get that up and running again as a precondition for any non-trivial code changes. Specifically, I'm looking into the gcc test suite. Trying to compile the 'attrib' test, I run into various problems:

* the file containing the expected output (run/attrib.out) contains a lot more than the output generated by the actual test run ('../../src/ctdemo run/attrib.c'), i.e. symbol, tag, and label listings. It seems the command used by the run_suite script isn't quite complete. Adding '-d' will add most of the missing stuff, though the listed names are mangled somehow; they appear to contain the location in the source file or something like that...

* the run/attrib.c file compiled with my gcc (3.2.2) generates a warning:

run/attrib.c:14: Warning - old-style declaration or incorrect type:
__attribute__ ((__format__ (printf, 2, 3)));

Is that a recent addition to gcc? Should that be fixed in the ctool? Or is there something else that can be done to make this warning go away? The same warning is issued at various places with other source files from that test suite as well.

* /usr/include/gconv.h generates an error message:

/usr/include/gconv.h:176: Unsized array not allowed as field:
__extension__ struct __gconv_step_data __data [];

Unfortunately I'm not a C expert, so I can't really comment on this, other than that I see the line in gram.y that issues this message. I've no idea whether that's a gcc extension, or generally how to deal with that.

* once again the question concerning 'warning: "__STDC__" redefined': what's causing this? Does the ctdemo define and inject that macro into the preprocessor? It seems to be defined by my system (either compiler or system headers or a combination thereof).

I would hugely appreciate some help on getting this stuff up and running; it would make it much easier to validate future code changes...

Thanks a lot!

Stefan
From: Stefan S. <se...@sy...> - 2003-08-22 09:40:37
hi there,

I'm looking at the project.cc file, specifically the Project::parse function. I'd like to reimplement this method using two new methods - preprocess and parse - such that the old parse method becomes obsolete and is only provided for backward compatibility.

I'm now in the process of implementing the 'preprocess' method, and I'm looking at the old code for inspiration. I'm hesitant to just cut&paste the code I see because I don't understand it. Or better: it looks quite suspicious, and I'm wondering whether anybody is actually using it. Examples:

* the default 'cd_cmd' parameter is "cd % ;", but that string, taken as a format string for 'sprintf', doesn't do anything, i.e. I guess it's a typo and it actually should read "cd %s ;".

* ctdemo passes 'NULL' for the cd_cmd if it isn't explicitly specified per '-cdcmmd' argument. What happens if 'sprintf' receives a NULL format string??

* I see code such as

if (use_cpp)
{
    if (cpp_outputfile)
        strcpy(cpp_file, cpp_outputfile);
    if (cpp_dir)
        sprintf(cpp_cmmd, cd_cmd, cpp_dir);
    if (cpp_cmd)
        sprintf(cpp_cmmd, cpp_cmd, path, cpp_file);
    else
        ...

which means the 'cpp_cmmd' variable is overwritten each time 'sprintf' is called. I'm obviously missing something here, though I don't see what. Or is it simply that there is lots of redundant code, i.e. this code never gets used, and so the 'bugs' never surface?

I'm tempted to just reimplement the 'preprocess' method and then use that instead. Any comments?

Best regards,
Stefan
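The suspected typo is easy to verify: "cd % ;" contains no valid conversion and never substitutes the directory, whereas "cd %s ;" does; and passing a NULL format string is undefined behavior for the whole printf family, so a caller has to guard against it. A small sketch of the presumably intended behavior (`cdCommand` is a hypothetical helper, not ctool's actual code, and it uses snprintf rather than the original's unbounded sprintf):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Build the "change directory" command prefix the way the "cd %s ;"
// format was presumably meant to: substitute the directory into the
// format string.  snprintf bounds the write, unlike the original
// sprintf calls.
static std::string cdCommand(const char* dir)
{
    char cmd[256];
    std::snprintf(cmd, sizeof cmd, "cd %s ;", dir);
    return cmd;
}
```

A NULL directory (mirroring ctdemo's NULL cd_cmd) would still have to be rejected before the call; this sketch assumes the caller has done so.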
From: Stefan S. <se...@sy...> - 2003-08-19 05:36:31
Stefan Seefeld wrote:

> * the parsing should be separate from preprocessing, such that
> users can plug in their own preprocessors.

right now the project.cpp file contains definitions used to call the preprocessor. It's the 'Project::parse' method that uses these settings, if the 'cpp_cmd' argument is NULL. Is that really a good idea? I'd suggest we require this argument to be non-zero, so the caller of Project::parse has to take care of the preprocessor that is to be called. That way we could remove lots of macros and defines from the src/ build instructions, and only the frontend (i.e. the 'Project::parse' caller) would be responsible for it. What do you think?

On a slightly different (but related) note: I'd really like to get the tests up and running on my system before I make any important changes to the actual code (as opposed to the build system infrastructure), so that I can see immediately whether my changes incur regressions. Unfortunately, the vast majority of the tests don't pass. Anything we can do about this situation?

Regards,
Stefan
From: Stefan S. <se...@sy...> - 2003-08-19 05:25:50
hi again,

here are some more updates on the build system evolution:

Stefan Seefeld wrote:

> * 'make install' should place the headers in <prefix>/include/ctool/
> and the library in <prefix>/lib such that other programs can
> detect them there.

the ctdemo tool is now in examples/ctdemo, and a top-level 'Makefile' will traverse src/ to only build the libs (.a and .so), and then step into examples/ (and recursively examples/ctdemo/) to build executables. The libs are built into lib/, ctdemo into bin/. 'make install' installs the libs into $(prefix)/lib, and the headers into $(prefix)/include/ctool.

Are there other examples to build besides ctdemo? Should any of them get installed?

Also, it would be nice if the test executions were integrated into the build system, i.e. 'make test' in the toplevel directory should traverse the 'regression/' subdir (should we rename that 'test/'?) and do all the necessary things.

Oh, and all this works well even if the build tree is outside the source tree. Here are the required steps:

$ cd ctool
$ cvs update
$ ./autogen.sh
$ mkdir ../ctool-build
$ cd ../ctool-build
$ ../ctool/configure
$ make

All this new build system stuff still needs some refinement, but the basic infrastructure is in place. I hope to get this new build system up to speed with everybody's help so it can replace the old Makefile(s) in the not-so-distant future.

Regards,
Stefan

PS: now it would really be great if there were any docs to build, i.e. 'make doc'... <wink/>
From: Stefan S. <se...@sy...> - 2003-08-19 01:44:08
Stefan Seefeld wrote:

> * the headers shouldn't be accessed directly like <decl.hpp>,
> but instead with a common prefix, such as <ctool/decl.hpp>

this part is done. I sent a little file layout change request to the SourceForge staff and today they processed the request. The headers are now in include/ctool, so I made the necessary adjustments to the #include statements as well as the Makefile to use the new location.

Regards,
Stefan
From: Shaun F. <fli...@so...> - 2003-08-11 16:18:02
These changes all sound good to me. The most worrisome one is the namespaces. We did this with our (large) library at work, finding out that some compilers (VC7, for example) have some odd namespace bugs. Still, I think it's worth doing, but it should probably be controllable via a new define in the Makefile.

I'm less concerned about users having to modify their Makefile a little; that's not nearly as much of a problem as having to change their source code.

Thanks,
Shaun

On Saturday, August 9, 2003, at 12:41 PM, Stefan Seefeld wrote:

> hi there,
>
> I'm looking at the ctool build system and code and I wonder
> what is needed to compile/install and use the API.
>
> There are some small but important changes I'd like to suggest
> that would make it much simpler to use ctool as a library:
>
> * the headers shouldn't be accessed directly like <decl.hpp>,
> but instead with a common prefix, such as <ctool/decl.hpp>
>
> * all declarations should be wrapped in a common namespace
> (such as, surprise, 'ctool').
>
> * 'make install' should place the headers in <prefix>/include/ctool/
> and the library in <prefix>/lib such that other programs can
> detect them there.
>
> * the parsing should be separate from preprocessing, such that
> users can plug in their own preprocessors.
>
> I realize that these are non-trivial changes, i.e. they break
> backward compatibility. On the other hand, there are easy ways
> to provide backward-compatible hacks for those who need them.
>
> What do you think of these changes? They are technically quite
> simple to accomplish; it's just a matter of determining the impact
> they'll have, and providing the right means for backward compatibility.
>
> Best regards,
> Stefan
From: Stefan S. <se...@sy...> - 2003-08-10 15:15:25
hi there,

I committed some new files for a new build system. Right now that consists of a set of files needed for autoconf / configure, as well as a single Makefile template ('Makefile.in'), which may eventually replace src/Makefile.

To try it out, step into the root directory and call 'autoconf' to generate the 'configure' script. Then run './configure' to generate the new src/Makefile. Finally, compile by running 'make -C src'.

I'd like to work on this further to be able to do a full build (the library, some tools, tests, etc.) with a single command ('make test', say), and that on build directories that are *outside* the src directory. That would be especially practical if you want to make changes and test them immediately on different platforms / configurations.

Anyways, before pushing that idea further I need to refine the configure tests to be fully portable, i.e. allow detecting different versions of bison/yacc, preprocessor options, etc.

Best regards,
Stefan
From: Stefan S. <se...@sy...> - 2003-08-09 19:43:39
hi there,

I'm looking at the ctool build system and code and I wonder what is needed to compile/install and use the API. There are some small but important changes I'd like to suggest that would make it much simpler to use ctool as a library:

* the headers shouldn't be accessed directly like <decl.hpp>, but instead with a common prefix, such as <ctool/decl.hpp>

* all declarations should be wrapped in a common namespace (such as, surprise, 'ctool').

* 'make install' should place the headers in <prefix>/include/ctool/ and the library in <prefix>/lib such that other programs can detect them there.

* the parsing should be separate from preprocessing, such that users can plug in their own preprocessors.

I realize that these are non-trivial changes, i.e. they break backward compatibility. On the other hand, there are easy ways to provide backward-compatible hacks for those who need them.

What do you think of these changes? They are technically quite simple to accomplish; it's just a matter of determining the impact they'll have, and providing the right means for backward compatibility.

Best regards,
Stefan
From: Stefan S. <se...@sy...> - 2003-08-09 19:32:53
|
Steven Singer wrote:
> Shaun Flisakowski wrote:
>
>> It sounds to me like we're basically in agreement here. My plan going
>> forward is to, for example, take the time to add accessors to all
>> classes I work on, without making the members private. So, the old
>> API continues to work while a better one is available.
>
> I'm not convinced that accessors gain us much.

The synopsis AST API (http://synopsis.sf.net) uses accessors for all types to access the members. Yet I'm pondering replacing them all with simple (public) variables. Of course, we are talking python here, so the object model isn't quite the same as in C++, but the basic idea is similar: the AST nodes seem to me to be basically data, with no or minimal behavior. Especially by means of the visitor pattern, all behavior could be factored out into external classes, so the only bit of polymorphism that has to remain in the AST nodes is the dispatching mechanism, i.e. the 'accept()' implementation.

By the way, my main interest in ctool is to hook the AST up with synopsis. Synopsis was originally developed as a source code documentation tool similar to doxygen, but it is quite a bit more powerful. For example, another developer added cross-reference support to it similar to lxr (http://lxr.linux.no/), so you can inspect not only declarations but any expressions. Right now this tool (we call it sxr) is only available with the C++ parser, but I'd like to provide the same with C (using ctool). For examples, have a look at http://synopsis.sourceforge.net/demo/index.html...

I wonder what you think of pushing that idea even further, i.e. making synopsis (python) modules that are basically scripting frontends to the ctool library as a whole, including stuff like call graph inspection.

Regards, Stefan |
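The "plain data plus accept()" idea could look roughly like this in C++. All class names are made up for illustration, not taken from ctool:

```cpp
#include <cassert>
#include <string>

struct IntConstant;
struct StringConstant;

// All behavior lives in external visitors; nodes only dispatch.
struct Visitor {
    virtual ~Visitor() = default;
    virtual void visit(IntConstant&) = 0;
    virtual void visit(StringConstant&) = 0;
};

struct Node {
    virtual ~Node() = default;
    virtual void accept(Visitor& v) = 0;   // the only polymorphism left
};

struct IntConstant : Node {
    int value = 0;                          // public data, no accessor
    void accept(Visitor& v) override { v.visit(*this); }
};

struct StringConstant : Node {
    std::string value;                      // public data, no accessor
    void accept(Visitor& v) override { v.visit(*this); }
};

// Example external behavior: count node kinds without touching the nodes.
struct Counter : Visitor {
    int ints = 0, strings = 0;
    void visit(IntConstant&) override { ++ints; }
    void visit(StringConstant&) override { ++strings; }
};
```

With this shape, accessors buy you nothing the public members don't already give you, which is the point Steven was making.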
From: Steven S. <ste...@cs...> - 2003-08-04 12:14:17
|
Shaun Flisakowski wrote:
> It sounds to me like we're basically in agreement here. My plan going
> forward is to, for example, take the time to add accessors to all
> classes I work on, without making the members private. So, the old API
> continues to work while a better one is available.

I'm not convinced that accessors gain us much. For example, I'm not sure that the Statement class should include a next pointer at all. I think that a Statement should be just a single statement, and that there should be a StatementList class. That sort of change will break the API in a manner that can't be fixed by accessor functions (or would leave a permanent wart on the side of the code).

The more I think about this, the more I think we should just design an API from scratch and initially offer that as a second, clean interface. This is an improvement over making minor changes to the existing API, as it would get rid of all the intermediate stages where we had APIs that were neither the new nor the old API. We could make a little playpen where we're free to add and remove parameters, functions, or even entire classes wholesale without annoying anyone. We can also do minor bits of tidying up which make the new API neater but would be completely unjustified if backward compatibility needed to be maintained. For example, spelling could be made consistent (InclStemnt could become IncludeStatement). When we finish playing and have a design we're happy with, we can then worry about migrating users.

I think we should think about this some more before making any changes. We can let the design go through several iterations before we even think about coding it (or coding any more than example fragments). I have some ideas I need to think through. I suspect I'll reject a lot of them.

> Yes, I've been a bit short on time and the necessary motivation lately
> to do cleanup like this. I was also having some difficulty with the
> early (Mac) versions of gcc 3.0 that led me to spend more time on my
> Metrowerks-using projects. :^)

I know the feeling. When things work, there isn't a real incentive to tidy them. There are always more important projects to work on (like the work I'm being paid for).

> I think the next thing needing doing has to be improved handling of gcc
> extensions, which is what I was working on during my last major ctool
> development push. Not being able to cleanly use standard include files
> is quite the bummer.

Agreed. This is more important. Also, this work is independent of the API changes.

- Steven |
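For what it's worth, the Statement/StatementList split could be as simple as the sketch below: a Statement knows nothing about its neighbours, and sequencing is owned by a separate container. The names and members here are hypothetical, not a proposed final design:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// A statement is just a single statement -- no intrusive 'next' pointer.
struct Statement {
    std::string text;   // stand-in for real statement contents
};

// Sequencing lives in its own class, so the list representation can
// change later without breaking the Statement API.
class StatementList {
public:
    void append(std::unique_ptr<Statement> s) {
        stmts_.push_back(std::move(s));
    }
    std::size_t size() const { return stmts_.size(); }
    const Statement& at(std::size_t i) const { return *stmts_.at(i); }

private:
    std::vector<std::unique_ptr<Statement>> stmts_;
};
```

This is exactly the kind of change accessors can't paper over: code that walked the old next pointer has to move to the list interface.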