Thread: [pygccxml-development] Indexing suite exception
From: Matthias B. <ba...@ir...> - 2006-06-19 08:04:29
Hi,

after being too busy the last couple of weeks to check out the new pyplusplus
stuff (and move the experimental stuff to the contrib directory) I finally did
an update on the code, and now I get the exception below when I try to create
the Maya bindings. My script is still unchanged and has worked with previous
versions. I also haven't specified anything that has to do with the indexing
suite, so I would have expected not to trigger anything related to this
feature.

Is this a bug in pyplusplus, or do I have to specify anything related to this
in my driver script?

- Matthias -

Traceback (most recent call last):
  File "pypp_setup.py", line 1174, in ?
    mod.writeModule()
  File "...pyplusplus/experimental/pypp_api.py", line 456, in writeModule
    mfs.write(write_main=multiCreateMain)
  File "...pyplusplus/file_writers/multiple_files.py", line 253, in write
    map( self.split_class, class_creators )
  File "...pyplusplus/file_writers/multiple_files.py", line 171, in split_class
    self.__split_class_impl( class_creator )
  File "...pyplusplus/file_writers/multiple_files.py", line 148, in __split_class_impl
    , [class_creator] ))
  File "...pyplusplus/file_writers/multiple_files.py", line 132, in create_source
    answer.append( code_creators.code_creator_t.indent( creator.create() ) )
  File "...pyplusplus/code_creators/code_creator.py", line 93, in create
    code = self._create_impl()
  File "...pyplusplus/code_creators/class_declaration.py", line 308, in _create_impl
    return self._generate_code_no_scope()
  File "...pyplusplus/code_creators/class_declaration.py", line 226, in _generate_code_no_scope
    class_constructor, used_init = self._generate_constructor()
  File "...pyplusplus/code_creators/class_declaration.py", line 210, in _generate_constructor
    elif self.declaration.indexing_suite:
  File "...pyplusplus/decl_wrappers/class_wrapper.py", line 108, in _get_indexing_suite
    if self._indexing_suite is None:
AttributeError: 'class_t' object has no attribute '_indexing_suite'
From: Roman Y. <rom...@gm...> - 2006-06-19 08:52:19
On 6/19/06, Matthias Baas <ba...@ir...> wrote:
> Hi,

Good morning. Good to hear from you again.

> after being too busy the last couple of weeks to check out the new
> pyplusplus stuff (and move the experimental stuff to the contrib
> directory) I finally did an update on the code, and now I get the
> exception below when I try to create the Maya bindings. My script is
> still unchanged and has worked with previous versions. I also haven't
> specified anything that has to do with the indexing suite, so I would
> have expected not to trigger anything related to this feature.
>
> Is this a bug in pyplusplus, or do I have to specify anything related to
> this in my driver script?

Cache !!! :-(((( We need to implement some mechanism that will prevent
this in the future.

> - Matthias -

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/
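(Roman's "Cache" refers to pyplusplus's declaration cache: declarations pickled by an older version of the library presumably lack the newly added _indexing_suite attribute, so deleting the cache file and letting the headers be re-parsed clears the error. A minimal sketch, with a hypothetical cache file name; use whatever name the driver script actually passes:)

    import os

    CACHE_FILE = "maya_decls.cache"   # hypothetical name; match the cache file used by the driver script

    # Remove the stale declaration cache so pyplusplus re-parses the headers
    # with the current version of its decl_wrappers classes.
    if os.path.exists( CACHE_FILE ):
        os.remove( CACHE_FILE )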
From: Matthias B. <ba...@ir...> - 2006-06-19 15:04:32
Roman Yakovenko wrote:
>> Is this a bug in pyplusplus, or do I have to specify anything related to
>> this in my driver script?
>
> Cache !!! :-(((( We need to implement some mechanism that will prevent
> this in the future.

Argh, yes, sorry! (Maybe it would be possible to store some sort of signature
of the classes (such as the number of attributes) and compare this signature
with the signature of the imported classes...?)

Now the script runs fine, but there's an error during compilation. What
puzzles me is that the methods that seem to trigger the error were not
wrapped by the previous version of pyplusplus (even though they should have
been; I was not ignoring them). For example, it's a method like this:

    bool MFnMesh::closestIntersection( const MFloatPoint&    raySource,
                                       const MFloatVector&   rayDirection,
                                       const MIntArray*      faceIds,
                                       const MIntArray*      triIds,
                                       bool                  idsSorted,
                                       MSpace::Space         space,
                                       float                 maxParam,
                                       bool                  testBothDirections,
                                       MMeshIsectAccelParams* accelerator,
                                       MFloatPoint&          hitPoint,
                                       float*                hitRayParam,
                                       int*                  hitFace,
                                       int*                  hitTriangle,
                                       float*                hitBary1,
                                       float*                hitBary2,
                                       float                 tolerance = 1e-6,
                                       MStatus*              ReturnStatus = NULL )

which is now turned into the following code:

    MFnMesh_exposer.def( "closestIntersection"
        , &::MFnMesh::closestIntersection
        , ( bp::arg("raySource"), bp::arg("rayDirection"), bp::arg("faceIds")
          , bp::arg("triIds"), bp::arg("idsSorted"), bp::arg("space")
          , bp::arg("maxParam"), bp::arg("testBothDirections"), bp::arg("accelerator")
          , bp::arg("hitPoint"), bp::arg("hitRayParam"), bp::arg("hitFace")
          , bp::arg("hitTriangle"), bp::arg("hitBary1"), bp::arg("hitBary2")
          , bp::arg("tolerance")=9.99999999999999954748111825886258685613938723691e-7
          , bp::arg("ReturnStatus")=bp::object() )
        , bp::default_call_policies() );

Compiling this results in a lengthy and somewhat difficult to read error
message. The actual error goes something like this:

    /sw/i386_linux-2.0_glibc2/boost-1.33.1/include/boost-1_33_1/boost/python/class.hpp:536: error:
      no matching function for call to `get_signature(bool (MFnMesh::*&)(const MFloatPoint&,
      const MFloatVector&, const MIntArray*, const MIntArray*, bool, MSpace::Space, float, bool,
      MMeshIsectAccelParams*, MFloatPoint&, float*, int*, int*, float*, float*, float, MStatus*),
      MFnMesh*)'
    /sw/i386_linux-2.0_glibc2/boost-1.33.1/include/boost-1_33_1/boost/python/class.hpp: In member
      function `void boost::python::class_<T, X1, X2, X3>::def_impl(T*, const char*, Fn,
      const Helper&, ...) [with T = MFnMesh, Fn = bool (MFnMesh::*)(const MFloatPoint&,
      const MFloatVector&, const MIntArray*, const MIntArray*, bool, MSpace::Space, float, bool,
      MMeshIsectAccelParams*, bool, MFloatPointArray&, MFloatArray*, MIntArray*, MIntArray*,
      MFloatArray*, MFloatArray*, float, MStatus*),
      Helper = boost::python::detail::def_helper<boost::python::detail::keywords<18>,
      boost::python::default_call_policies, boost::python::detail::not_specified,
      boost::python::detail::not_specified>, W = MFnMesh_wrapper,
      X1 = boost::python::bases<MFnDagNode, mpl_::void_, mpl_::void_, mpl_::void_, mpl_::void_,
      mpl_::void_, mpl_::void_, mpl_::void_, mpl_::void_, mpl_::void_>,
      X2 = boost::python::detail::not_specified, X3 = boost::python::detail::not_specified]':

So what is wrong with this method? (Are there too many arguments? Are there
some required policies missing?) And another question is why this was ignored
in previous versions. Which modification of pyplusplus has changed that?

- Matthias -
From: Roman Y. <rom...@gm...> - 2006-06-19 17:31:47
On 6/19/06, Matthias Baas <ba...@ir...> wrote:
>
> So what is wrong with this method? (Are there too many arguments? Are
> there some required policies missing?)
> And another question is why this was ignored in previous versions. Which
> modification of pyplusplus has changed that?

This should explain it:

http://svn.sourceforge.net/viewcvs.cgi/pygccxml/pyplusplus_dev/pyplusplus/decl_wrappers/calldef_wrapper.py?r1=195&r2=214

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/
From: Matthias B. <ba...@ir...> - 2006-06-20 13:03:18
Roman Yakovenko wrote:
> On 6/19/06, Matthias Baas <ba...@ir...> wrote:
>>
>> So what is wrong with this method? (Are there too many arguments? Are
>> there some required policies missing?)
>> And another question is why this was ignored in previous versions. Which
>> modification of pyplusplus has changed that?
>
> This should explain it:
>
> http://svn.sourceforge.net/viewcvs.cgi/pygccxml/pyplusplus_dev/pyplusplus/decl_wrappers/calldef_wrapper.py?r1=195&r2=214

Ah, I see. Sorry, there actually was a warning message, but it got drowned in
the output, so I didn't notice it.

But as pyplusplus already knows that a particular member won't compile,
shouldn't it refuse to write it to the output file (as was the case before)?
Or even abort, or something, so that the user definitely knows that some
action is required on his side?

In the code the maximum number of arguments (10) is hard-coded; shouldn't
that at least be user-settable (or, if possible, automatically determined)?
What if the user actually did increase BOOST_PYTHON_MAX_ARITY?

- Matthias -
From: Allen B. <al...@vr...> - 2006-06-20 13:19:10
Matthias Baas wrote:
> Roman Yakovenko wrote:
>> On 6/19/06, Matthias Baas <ba...@ir...> wrote:
>>> So what is wrong with this method? (Are there too many arguments? Are
>>> there some required policies missing?)
>>> And another question is why this was ignored in previous versions.
>>> Which modification of pyplusplus has changed that?
>>
>> This should explain it:
>>
>> http://svn.sourceforge.net/viewcvs.cgi/pygccxml/pyplusplus_dev/pyplusplus/decl_wrappers/calldef_wrapper.py?r1=195&r2=214
>
> Ah, I see. Sorry, there actually was a warning message, but it got drowned
> in the output, so I didn't notice it.
>
> But as pyplusplus already knows that a particular member won't compile,
> shouldn't it refuse to write it to the output file (as was the case
> before)? Or even abort, or something, so that the user definitely knows
> that some action is required on his side?

It would also be helpful if the message contained the name of the method, so
the user knew which method was causing the problem.

-Allen

> In the code the maximum number of arguments (10) is hard-coded; shouldn't
> that at least be user-settable (or, if possible, automatically
> determined)? What if the user actually did increase
> BOOST_PYTHON_MAX_ARITY?
>
> - Matthias -
From: Roman Y. <rom...@gm...> - 2006-06-20 13:35:31
On 6/20/06, Allen Bierbaum <al...@vr...> wrote:
> It would also be helpful if the message contained the name of the method,
> so the user knew which method was causing the problem.

You are right. Fixed and committed.

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/
From: Roman Y. <rom...@gm...> - 2006-06-20 13:20:20
On 6/20/06, Matthias Baas <ba...@ir...> wrote:
> Ah, I see. Sorry, there actually was a warning message, but it got drowned
> in the output, so I didn't notice it.

I am aware of this problem (a lot of output), so the following code will make
pyplusplus write only the important things:

    import logging
    from pyplusplus import module_builder
    module_builder.set_logger_level( logging.INFO )

> But as pyplusplus already knows that a particular member won't compile,
> shouldn't it refuse to write it to the output file (as was the case
> before)? Or even abort, or something, so that the user definitely knows
> that some action is required on his side?

Now that you have had this bad experience (and I am sorry for that), you can
give advice. In my opinion pyplusplus did its job, but of course I could be
wrong.

> In the code the maximum number of arguments (10) is hard-coded; shouldn't
> that at least be user-settable

Yes, I fixed and committed this.

> (or, if possible, automatically determined).

It is not worth it; too much work (make files, auto configuration, scons,
bjam, project files, ...).

> What if the user actually did increase BOOST_PYTHON_MAX_ARITY?

Then he will update decl_wrappers.calldef_t.BOOST_PYTHON_MAX_ARITY to the
actual number and will get rid of the warning :-)

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/
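(For a driver script that raises the arity limit at compile time anyway, the matching change on the pyplusplus side would look roughly like the sketch below. It uses only the attribute Roman names above; the value 19 is illustrative:)

    from pyplusplus import decl_wrappers

    # Tell pyplusplus that the extension will be compiled with
    # -DBOOST_PYTHON_MAX_ARITY=19, so functions with many arguments
    # (such as MFnMesh::closestIntersection) no longer trigger the warning.
    decl_wrappers.calldef_t.BOOST_PYTHON_MAX_ARITY = 19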
From: Matthias B. <ba...@ir...> - 2006-06-21 09:41:35
Roman Yakovenko wrote:
> I am aware of this problem (a lot of output), so the following code will
> make pyplusplus write only the important things:
>
> import logging
> from pyplusplus import module_builder
> module_builder.set_logger_level( logging.INFO )

I have that already in my script, but I still get a line for each individual
file that is written. Maybe these messages should be debug messages instead
of info messages (I think a summary of the time it took to write the files,
and maybe how many files were actually updated, would be enough in the
standard case).
(But apart from that, there is still a lot of output from my own script that
would drown the message anyway.)

>> But as pyplusplus already knows that a particular member won't compile,
>> shouldn't it refuse to write it to the output file (as was the case
>> before)? Or even abort, or something, so that the user definitely knows
>> that some action is required on his side?
>
> Now that you have had this bad experience (and I am sorry for that), you
> can give advice. In my opinion pyplusplus did its job, but of course I
> could be wrong.

Well, in general I'd say that knowingly producing code that won't compile is
bad practice. In this particular case I can live with it as it is, because
1) there actually is a warning message (even though it might pass unnoticed)
and 2) once you know what the problem is, it is easy to either ignore those
methods or modify Boost.Python accordingly. For now, I just ignored them,
which was only one additional line in my script.

In my opinion, what should actually be "fixed" here is the way important
information is passed to the user. I think it's not so much of a problem that
pyplusplus generated invalid code; it's more serious that I missed the
warnings about it (if I had seen them I wouldn't have tried compiling the
code in the first place).

The problem is that a command line tool basically has only one channel
(stdout) to communicate with the user (having stderr as well is only a slight
improvement, as both refer to the same "channel" by default, namely the
console window). My suggestion would be to write the really important
messages (like the max arity thing) into a separate log file in addition to
writing to stdout (or actually it should be stderr). In my opinion this
should already be the default behavior, so a user can check any time after
the tool has run whether there have been important issues. Before quitting,
pyplusplus could even check whether anything has been written into that log
file and print a final message that there have been critical errors and the
user should refer to that log file.

>> In the code the maximum number of arguments (10) is hard-coded; shouldn't
>> that at least be user-settable
>
> Yes, I fixed and committed this.
>
>> (or, if possible, automatically determined).
>
> It is not worth it; too much work (make files, auto configuration, scons,
> bjam, project files, ...).

Yes, I agree. If it's not inside an official header file, then it's not worth
trying to get hold of the config files and parse them.

>> What if the user actually did increase BOOST_PYTHON_MAX_ARITY?
>
> Then he will update decl_wrappers.calldef_t.BOOST_PYTHON_MAX_ARITY to the
> actual number and will get rid of the warning :-)

Aha, there we are... :)
(though I'd recommend adding functionality to the high-level API to
read/write this value, so that the details of where the attribute is actually
stored in pyplusplus are encapsulated)

- Matthias -
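(The "one additional line" that ignores such methods might look roughly like this with the module_builder API; mb is assumed to be the module_builder_t instance created earlier in the driver script, and the argument-count threshold is only illustrative:)

    # mb is the module_builder_t instance created earlier in the script.
    # Exclude member functions whose argument count exceeds what the default
    # BOOST_PYTHON_MAX_ARITY can handle (threshold chosen for illustration).
    mb.member_functions( lambda f: len( f.arguments ) > 15 ).exclude()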
From: Roman Y. <rom...@gm...> - 2006-06-21 10:22:41
On 6/21/06, Matthias Baas <ba...@ir...> wrote:
> Roman Yakovenko wrote:
>> I am aware of this problem (a lot of output), so the following code will
>> make pyplusplus write only the important things:
>>
>> import logging
>> from pyplusplus import module_builder
>> module_builder.set_logger_level( logging.INFO )
>
> I have that already in my script, but I still get a line for each
> individual file that is written. Maybe these messages should be debug
> messages instead of info messages (I think a summary of the time it took
> to write the files, and maybe how many files were actually updated, would
> be enough in the standard case).
> (But apart from that, there is still a lot of output from my own script
> that would drown the message anyway.)

Do you want to, and have time to, make the pyplusplus messages really useful?
Can you fix the current situation?

> In my opinion, what should actually be "fixed" here is the way important
> information is passed to the user.

I agree with you. The amount of information written by pyplusplus is so big
that users just ignore it. This is the opposite of what I want.

> I think it's not so much of a problem that pyplusplus generated invalid
> code; it's more serious that I missed the warnings about it (if I had
> seen them I wouldn't have tried compiling the code in the first place).

Would it be helpful if pyplusplus dumped this information into the generated
source files, too? Then, when you see a compilation error, you can also read
an explanation.

> The problem is that a command line tool basically has only one channel
> (stdout) to communicate with the user (having stderr as well is only a
> slight improvement, as both refer to the same "channel" by default, namely
> the console window). My suggestion would be to write the really important
> messages (like the max arity thing) into a separate log file in addition
> to writing to stdout (or actually it should be stderr).

What is your definition/guideline of "really important"? Also, I agree with
you.

> In my opinion this should already be the default behavior, so a user can
> check any time after the tool has run whether there have been important
> issues. Before quitting, pyplusplus could even check whether anything has
> been written into that log file and print a final message that there have
> been critical errors and the user should refer to that log file.

Writing the log to a file is quite an improvement. At the top of the file we
could write a small guide: search for the word WARNING or ERROR. Or something
like this.

>>> What if the user actually did increase BOOST_PYTHON_MAX_ARITY?
>>
>> Then he will update decl_wrappers.calldef_t.BOOST_PYTHON_MAX_ARITY to the
>> actual number and will get rid of the warning :-)
>
> Aha, there we are... :)
> (though I'd recommend adding functionality to the high-level API to
> read/write this value, so that the details of where the attribute is
> actually stored in pyplusplus are encapsulated)

:-) I will add a new property to module_builder_t. That is what you meant,
right?

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/
From: Lakin W. <lak...@gm...> - 2006-06-21 13:24:54
On 6/21/06, Matthias Baas <ba...@ir...> wrote:
> Roman Yakovenko wrote:
>> I am aware of this problem (a lot of output), so the following code will
>> make pyplusplus write only the important things:
>>
>> import logging
>> from pyplusplus import module_builder
>> module_builder.set_logger_level( logging.INFO )
>
> I have that already in my script, but I still get a line for each
> individual file that is written. [...]
>
> Well, in general I'd say that knowingly producing code that won't compile
> is bad practice. [...]
> In my opinion, what should actually be "fixed" here is the way important
> information is passed to the user. I think it's not so much of a problem
> that pyplusplus generated invalid code

I know that I'm just arguing semantics, but I disagree. The code _does_
compile and is therefore _not_ invalid. It just requires you to pass an extra
switch to the compiler, in this case -DBOOST_PYTHON_MAX_ARITY=19 or some
such.

> but it's more serious that I missed the warnings about it (if I had seen
> them I wouldn't have tried compiling the code in the first place).
> The problem is that a command line tool basically has only one channel
> (stdout) to communicate with the user (having stderr as well is only a
> slight improvement, as both refer to the same "channel" by default, namely
> the console window). My suggestion would be to write the really important
> messages (like the max arity thing) into a separate log file in addition
> to writing to stdout (or actually it should be stderr).
> In my opinion this should already be the default behavior, so a user can
> check any time after the tool has run whether there have been important
> issues. Before quitting, pyplusplus could even check whether anything has
> been written into that log file and print a final message that there have
> been critical errors and the user should refer to that log file.

I don't mind the current behavior, as it would be easy for the user to hook
into the logging module and write out messages above a certain importance to
a separate log file. However, having some sort of summary printed at the end,
listing all the interesting things that a user may have to take care of,
would be nice.

>> In the code the maximum number of arguments (10) is hard-coded; shouldn't
>> that at least be user-settable
>
> Yes, I fixed and committed this.
>
>> (or, if possible, automatically determined).
>
> It is not worth it; too much work (make files, auto configuration, scons,
> bjam, project files, ...).
>
> Yes, I agree. If it's not inside an official header file, then it's not
> worth trying to get hold of the config files and parse them.

What about generating

    #define BOOST_PYTHON_MAX_ARITY 19

at the top of each file which contains those methods that are too long? I'm
not sure about this particular option, but maybe it is possible to set it on
a per-file basis.

Lakin
From: Lakin W. <lak...@gm...> - 2006-06-21 16:26:38
On 6/21/06, Lakin Wecker <lak...@gm...> wrote:
> On 6/21/06, Matthias Baas <ba...@ir...> wrote:
>> but it's more serious that I missed the warnings about it (if I had seen
>> them I wouldn't have tried compiling the code in the first place).
>> The problem is that a command line tool basically has only one channel
>> (stdout) to communicate with the user (having stderr as well is only a
>> slight improvement, as both refer to the same "channel" by default,
>> namely the console window). My suggestion would be to write the really
>> important messages (like the max arity thing) into a separate log file
>> in addition to writing to stdout (or actually it should be stderr).
>> In my opinion this should already be the default behavior, so a user can
>> check any time after the tool has run whether there have been important
>> issues. Before quitting, pyplusplus could even check whether anything
>> has been written into that log file and print a final message that there
>> have been critical errors and the user should refer to that log file.
>
> I don't mind the current behavior, as it would be easy for the user to
> hook into the logging module and write out messages above a certain
> importance to a separate log file.

As a reference: http://docs.python.org/lib/multiple-destinations.html

If pyplusplus is reporting this as a warning, then I think that's appropriate
and that pyplusplus has done its job.

On a side note, it would be nice to have a small wiki somewhere to aggregate
all of these best practices for pyplusplus, such as setting up multiple
loggers in order to capture _all_ output somewhere for debugging purposes,
and capturing only the important pyplusplus messages to the console or
another file for general success feedback.

Lakin
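(The multiple-destinations recipe boils down to attaching one logging handler per destination. A minimal sketch, assuming the pyplusplus loggers propagate to the root logger, which may not hold for every setup; the log file name is a placeholder:)

    import logging

    # Console shows only the important messages...
    console = logging.StreamHandler()
    console.setLevel( logging.WARNING )

    # ...while everything, including INFO/DEBUG chatter, goes to a file
    # that can be inspected after the run.
    logfile = logging.FileHandler( 'pyplusplus.log' )
    logfile.setLevel( logging.DEBUG )

    formatter = logging.Formatter( '%(name)s %(levelname)s: %(message)s' )
    console.setFormatter( formatter )
    logfile.setFormatter( formatter )

    root = logging.getLogger()   # assumption: pyplusplus loggers propagate here
    root.setLevel( logging.DEBUG )
    root.addHandler( console )
    root.addHandler( logfile )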
From: Matthias B. <ba...@ir...> - 2006-06-22 12:31:22
Lakin Wecker wrote:
>> information is passed to the user. I think it's not so much of a problem
>> that pyplusplus generated invalid code
>
> I know that I'm just arguing semantics, but I disagree. The code _does_
> compile and is therefore _not_ invalid. It just requires you to pass an
> extra switch to the compiler, in this case -DBOOST_PYTHON_MAX_ARITY=19
> or some such.

Oops, I just noticed I misunderstood the whole thing. I thought I had to set
the max arity when compiling *Boost.Python* instead of my own code. Now that
I know I have to set this when compiling my own module, it's not so much of
an issue anymore. And yes, the code generated by pyplusplus is not invalid in
this case. But as you suggested, there could be an option so that pyplusplus
just puts the appropriate #define right inside the file.

> I don't mind the current behavior, as it would be easy for the user to
> hook into the logging module and write out messages above a certain
> importance to a separate log file.

Right, but why should I do that if I don't suspect any problems in the first
place? My point is just that it always makes sense to dump serious error
messages into a file for later inspection, so why not make this the default
behavior? But that's only a minor issue; it's more important that pyplusplus
internally categorizes its messages properly so that the user can filter out
what is interesting to him.

> However, having some sort of summary printed at the end, listing all the
> interesting things that a user may have to take care of, would be nice.

Agreed.

- Matthias -
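(Setting the macro while compiling one's own module can be done straight from the setup script. A sketch using distutils; the module name and source file list are placeholders, not the actual Maya binding project:)

    from distutils.core import setup
    from distutils.extension import Extension

    maya_ext = Extension(
        'maya',                                   # placeholder module name
        sources = [ 'MFnMesh.pypp.cpp' ],         # placeholder generated sources
        # Raise the Boost.Python arity limit for the many-argument methods.
        define_macros = [ ( 'BOOST_PYTHON_MAX_ARITY', '19' ) ],
        libraries = [ 'boost_python' ],
    )

    setup( name = 'maya', ext_modules = [ maya_ext ] )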
From: Roman Y. <rom...@gm...> - 2006-06-21 18:22:58
On 6/21/06, Lakin Wecker <lak...@gm...> wrote:
> What about generating
>
> #define BOOST_PYTHON_MAX_ARITY 19
>
> at the top of each file which contains those methods that are too long?
> I'm not sure about this particular option, but maybe it is possible to set
> it on a per-file basis.

I think that this is a good idea. pyplusplus can generate some kind of
configuration file and include it in every generated file. Thus the user will
have a single place to configure all the bindings. Thoughts?

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/
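(No such configuration file existed in pyplusplus at the time; the sketch below only illustrates the idea from the driver-script side, with hypothetical file and function names, by writing one shared header that every generated source file could then #include first:)

    import os

    def write_bindings_config( output_dir, max_arity = 19 ):
        """Write a single configuration header shared by all generated files."""
        config_path = os.path.join( output_dir, '__bindings_config.h' )  # hypothetical name
        config = open( config_path, 'w' )
        config.write( '#ifndef BINDINGS_CONFIG_H\n' )
        config.write( '#define BINDINGS_CONFIG_H\n\n' )
        config.write( '#define BOOST_PYTHON_MAX_ARITY %d\n\n' % max_arity )
        config.write( '#endif\n' )
        config.close()

    write_bindings_config( 'generated', max_arity = 19 )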
From: Lakin W. <lak...@gm...> - 2006-06-21 18:47:53
On 6/21/06, Roman Yakovenko <rom...@gm...> wrote:
> On 6/21/06, Lakin Wecker <lak...@gm...> wrote:
>> What about generating
>>
>> #define BOOST_PYTHON_MAX_ARITY 19
>>
>> at the top of each file which contains those methods that are too long?
>> I'm not sure about this particular option, but maybe it is possible to
>> set it on a per-file basis.
>
> I think that this is a good idea. pyplusplus can generate some kind of
> configuration file and include it in every generated file. Thus the user
> will have a single place to configure all the bindings.

I like this idea.

Lakin Wecker
From: Matthias B. <ba...@ir...> - 2006-06-22 14:17:44
Roman Yakovenko wrote:
>> I have that already in my script, but I still get a line for each
>> individual file that is written. Maybe these messages should be debug
>> messages instead of info messages (I think a summary of the time it took
>> to write the files, and maybe how many files were actually updated,
>> would be enough in the standard case).
>> (But apart from that, there is still a lot of output from my own script
>> that would drown the message anyway.)
>
> Do you want to, and have time to, make the pyplusplus messages really
> useful? Can you fix the current situation?

I'll have a look at it when I have some time left...

>> I think it's not so much of a problem that pyplusplus generated invalid
>> code; it's more serious that I missed the warnings about it (if I had
>> seen them I wouldn't have tried compiling the code in the first place).
>
> Would it be helpful if pyplusplus dumped this information into the
> generated source files, too? Then, when you see a compilation error, you
> can also read an explanation.

So far I wouldn't have looked into the generated source code to find
documentation text explaining why something failed, but I like the idea. I
think it wouldn't do any harm to put some extra comments into the source
files if pyplusplus knows in advance that a particular construct might cause
problems in some situations.

>> default, namely the console window). My suggestion would be to write the
>> really important messages (like the max arity thing) into a separate log
>> file in addition to writing to stdout (or actually it should be stderr).
>
> What is your definition/guideline of "really important"? Also, I agree
> with you.

As it turned out, the max arity issue is not as important as I thought (see
my previous mail). But in general, I think the important messages are those
that are supposed to trigger some user action (e.g. missing policies, where
the user must provide additional information to be able to get source code
that compiles).

>>> Then he will update decl_wrappers.calldef_t.BOOST_PYTHON_MAX_ARITY to
>>> the actual number and will get rid of the warning :-)
>>
>> Aha, there we are... :)
>> (though I'd recommend adding functionality to the high-level API to
>> read/write this value, so that the details of where the attribute is
>> actually stored in pyplusplus are encapsulated)
>
> :-) I will add a new property to module_builder_t. That is what you
> meant, right?

Right. :)

- Matthias -
From: Roman Y. <rom...@gm...> - 2006-06-22 17:45:27
On 6/22/06, Matthias Baas <ba...@ir...> wrote:
>> Do you want to, and have time to, make the pyplusplus messages really
>> useful? Can you fix the current situation?
>
> I'll have a look at it when I have some time left...

Thanks.

> As it turned out, the max arity issue is not as important as I thought
> (see my previous mail). But in general, I think the important messages
> are those that are supposed to trigger some user action (e.g. missing
> policies, where the user must provide additional information to be able
> to get source code that compiles).

Good definition.

> Right. :)

Yesterday I committed the patch that fixed the situation.

--
Roman Yakovenko
C++ Python language binding
http://www.language-binding.net/