Eric M. Ludlam wrote:
> M-: (eieio-persistent-save semanticdb-current-database) RET
> which will do the save, and allow errors to be caught by the
> debugger. This stack will probably point at writing out some list
> that has pre-processor symbols in it. I worked around that a couple
> days ago, but your header may be slightly different.
OK, well here is the stack dump: http://pastie.org/214130.txt
I don't have the expertise to work out what it means, but I can't help
noticing that the first macro in the list is BOOST_PP_REPEAT_2_I, which
is defined thus:
# define BOOST_PP_REPEAT_2_I(c, m, d) BOOST_PP_REPEAT_2_ ## c(m, d)
Is the token pasting used here causing the problem? Or does it point in
the right direction?
> If the issue is in the preprocessor writer, perhaps the limiter I
> placed there is too large. Try shrinking:
> to 1, which should certainly fix it. Then increase it to 10, or
> whichever is needed for any macros you require out of your header
> files. You can change this variable, then try:
> M-x semanticdb-save-db
OK, so that seemed to work initially. I reduced it to 1, ran
semanticdb-save-all-db, and all seemed well. Specifically, there was a
cache file created for this directory.
A bit later it is obviously parsing other files, and I'm still getting
the error, even with the max length still set to 1. In this case there
are two errors, in a slightly different location.
> Alternatively, use semantic--before-fetch-tags-hook to shut things off
> for particular files. Something like this:
> (add-hook 'semantic--before-fetch-tags-hook
>           (lambda () (if (string-match "^/path/I/dont/like"
>                                        (buffer-file-name))
>                          nil t)))
That works nicely, thanks.