From: Reini U. <ru...@x-...> - 2004-05-01 18:14:29
I would appreciate it if someone could help me find the cause of the problem with the current CVS code. See http://phpwiki.sourceforge.net/demo/

It looks like an InlineParser or BlockParser problem. The last change was the new SimpleMarkup Markup_plugin, which works fine for me, but it breaks even without this new markup. Actions without display do work fine at sf.net.

I tested it with login.tmpl, which is displayed fine without the line

    $t = asXML(TransformText(_("You may sign in using any
        [WikiWord|AddingPages] as a user id. (Any characters in %s etc.
        may be used too). The user id will be used as a link in
        RecentChanges to your home page."), 2.0, true));

but breaks with the TransformText() function.

It could also be a subtle configuration problem, since this code is very fresh. It works for me; only the sf.net site breaks. So if the current CVS code also breaks for someone else who is able to debug it, that would be great.

--
Reini Urban
http://xarch.tu-graz.ac.at/home/rurban/
From: Reini U. <ru...@x-...> - 2004-05-06 19:34:50
Reini Urban schrieb:
> I would appreciate it if someone could help me find the cause of the
> problem with the current CVS code.
> See http://phpwiki.sourceforge.net/demo/
> It looks like an InlineParser or BlockParser problem.
> [...]

Apparently nobody was able to reproduce this problem, as it only happens on sf.net. I have now added some debugging options to avoid the endless loop in the InlineParser. This doesn't solve the problem, but it helps the poor sf.net server and displays an almost empty page. With DEBUG = on you get this behavior; with DEBUG = false you get the old assertion errors and will not be able to back up or fix anything.

http://phpwiki.sourceforge.net/demo/

--
Reini Urban
http://xarch.tu-graz.ac.at/home/rurban/
From: Dan F. <dfr...@cs...> - 2004-05-07 16:54:15
Reini Urban wrote:
> Apparently nobody was able to reproduce this problem, as it only
> happens on sf.net.
> I added now some debugging options to avoid the endless loop in the
> InlineParser, which doesn't solve the problem, but helps the poor
> sf.net server and displays an almost empty page.
> [...]

I was not able to reproduce the problem. I might work on debugging the SourceForge site if I had access to it. What's the word on that?

Dan
From: Reini U. <ru...@x-...> - 2004-05-07 19:59:24
Dan Frankowski schrieb:
> I was not able to reproduce the problem. I might work on debugging the
> SourceForge site if I had access to it. What's the word on that?

The problem for us developers is that we can only do printf-style debugging on sf.net, and only for one day: each night the whole content is overwritten by Steve's automatic CVS update script.

But most importantly, we cannot use a PHP debugger there. So I would love to find anyone with debugging skills on whose site this problem also exists; that would narrow the search for the cause.

Yesterday I could reproduce a similar problem with an endless loop in the InlineParser, so at least some parts of the website are accessible again. I also found a strange dba session problem with the particular PHP versions there, 4.1.2 and 4.1.1, but it is not really related to my current problem.

--
Reini Urban
http://xarch.tu-graz.ac.at/home/rurban/
From: Joby W. <joby@u.washington.edu> - 2004-05-07 20:19:25
I haven't been able to reproduce the error, which is too bad because I have just finished setting up Xdebug.

Joby Walker
C&C Computer Operations Software Support Group

Reini Urban wrote:
> The problem is for us developers, that we can only do printf-style
> debugging on sf.net, and this only for one day. On the next night the
> whole content will be overwritten by steve's automatic cvs update
> script.
> But most importantly we cannot use a php debugger there.
> [...]
From: Reini U. <ru...@x-...> - 2004-05-08 19:03:43
I have now found the problem with the current InlineParser, and why it fails only on sf.net: the PHP build at sf.net has less memory available for regular expressions than a typical PHP. Both have an 8M memory_limit, but anchored PCRE regexes apparently allocate from somewhere else.

The problem is RegexpSet::_match with its huge regexp string, which with the added inline plugin markup now overflows that limit. The pattern is constructed from

    $pat = "/ ( . $repeat ) ( (" . join(')|(', $regexps) . ") ) /Asx";

The modifier A (ANCHORED) tells PCRE to store the matched block. $regexps is an array of 10 rather complicated regex strings, and $repeat grows from "*?" towards "{nn}", so the prematch gets longer and longer until nothing is found anymore and the final "$" regexp matches, which ends the loop.

On sf.net we don't get an endless loop; we rather run out of memory, because of the repeated anchored matching of the same huge regexp until $repeat gets large enough. The /A tells PCRE to store the matching block, to notify match() which regexp actually matched, and to be able to recurse into shorter substrings.

I have now rewritten that critical part to be somewhat slower but to need much less memory. We don't really need to string-join the regexps array together; it is sufficient to loop through all regexps until one balanced or simple markup matches. The catch is that the longest substring should be favoured, so that the parser recurses into the matches; that is what /A is for. For example, for "<small>*WikiWord*</small>" it has to match first the balanced <small> tag, then the *...* emphasis, and finally the WikiWord inside.

The largest partial regexp is the interwiki map, which constructs

    "(moniker1:|moniker2:|moniker3:|moniker4:|moniker5:|moniker6:|moniker7:|...)"

Without this regexp it doesn't run out of memory anymore.

Joby Walker schrieb:
> I haven't been able to reproduce the error. Which is too bad because I
> have just finished setting up Xdebug.
> [...]

--
Reini Urban
http://xarch.tu-graz.ac.at/home/rurban/
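[Editor's note: the rewritten matching strategy described above, looping over the individual patterns anchored at the current position and favouring the longest match instead of compiling one joined alternation, can be sketched in Python. The sub-patterns below are illustrative stand-ins, not PhpWiki's real markup regexes.]

```python
import re

# Illustrative stand-ins for the InlineParser's markup regexes;
# the real set has about 10 far more complicated patterns.
REGEXPS = [
    r"\*[^*\n]+\*",                  # *emphasized* text
    r"\[[^\]\n]+\]",                 # [bracket link]
    r"[A-Z][a-z]+(?:[A-Z][a-z]+)+",  # WikiWord
]

def longest_anchored_match(text, pos=0):
    """Try each pattern anchored at pos (the role PCRE's /A modifier
    plays) and return (index, match) for the longest match, or None."""
    best = None
    for i, pat in enumerate(REGEXPS):
        m = re.compile(pat).match(text, pos)  # match() anchors at pos
        if m and (best is None or m.end() > best[1].end()):
            best = (i, m)
    return best
```

Matching each pattern separately keeps every compiled regex small, at the cost of several match calls per position, which is the slower-but-leaner trade-off described above. The interwiki alternation could be shrunk the same way, by matching a generic moniker pattern such as `\w+:` and then checking the captured moniker against the map.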
From: Reini U. <ru...@x-...> - 2004-05-08 22:55:57
Reini Urban schrieb:
> I have now found the problem with the current InlineParser, and why it
> fails only on sf.net: the PHP build at sf.net has less memory available
> for regular expressions than a typical PHP. Both have an 8M
> memory_limit, but anchored PCRE regexes apparently allocate from
> somewhere else.
> [...]

The problem on http://phpwiki.sf.net/demo/ is fixed. It was not the memory after all; it was an endless loop, caused by an empty definition of WIKI_NAME_REGEXP, which I have now fixed in IniConfig.php. Exactly this constant was not checked for its default setting. Anyway, the huge regexp string is gone as well, and the whole inline parsing is now a lot better, falling back to the previous hairy code only if two conflicting markups are found in the same block.

--
Reini Urban
http://xarch.tu-graz.ac.at/home/rurban/
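[Editor's note: the failure mode above is easy to reproduce in any scanner loop. An empty regex matches zero characters at every position, so a loop that advances by the length of the match never makes progress. A minimal sketch of the hazard and a guard against it, in Python with a hypothetical scanner rather than PhpWiki's actual code:]

```python
import re

def scan(text, pattern):
    """Tokenize text with one pattern, guarding against zero-width
    matches: an empty pattern (like the empty WIKI_NAME_REGEXP)
    would otherwise pin pos in place forever."""
    regex = re.compile(pattern)
    pos, tokens = 0, []
    while pos < len(text):
        m = regex.match(text, pos)
        if m is None or m.end() == m.start():
            pos += 1          # no progress possible: skip one character
            continue
        tokens.append(m.group(0))
        pos = m.end()
    return tokens
```

With the guard, scan("abc", "") terminates with no tokens instead of looping forever; validating constants like WIKI_NAME_REGEXP against a sane default in IniConfig.php keeps the empty pattern from ever reaching the parser in the first place.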