From: Jeff D. <da...@da...> - 2003-03-27 21:07:19
> When we take a large wiki node, edit it, preview it, and save it, whole
> paragraphs are duplicated in the result. This is not operator error --
> it's happened several times, even from cut-and-pasting from a text
> editor window! I'm unclear as to what is causing this corruption, but it
> has rendered the wiki all but unusable for large documents.

How large is large?

Could the problem be browser dependent? (I vaguely remember stories about
some browsers having issues when the data in the textarea gets to be
larger than a certain size (32K or 64K).)

> There are other somewhat less serious issues, too -- when a wiki node
> has a large table in it (>200 rows) it simply freezes up the machine.

Is this a "new-style" table or an old-style table (via the OldStyleTable
plugin)? Is there an example at a publicly accessible URL? Or can you
send me some example wiki-text which will trigger the bug?

(Do you really mean "freezes up the machine"? Do other concurrent
requests to the web server get hung?)

> "Document History" does not appear to work correctly for documents that
> have been edited a large number of times. (It does not show the most
> recent changes made.) Also, our listing of RecentChanges seems to have
> stopped about seven days ago.

Those problems sound related. I've never seen nor heard of behavior like
that before. Are you sure that things are happy on the MySQL front?
(Run (my)isamchk on the tables. Any filesystem errors? Enough disk
space?)

> Have any of you guys seen this?

I haven't.

> if (php_sapi_name() == 'apache' || php_sapi_name() == 'apache2filter')

(This has been fixed in CVS.)

> Could it be something about our Apache2 interactions that's messing
> things up?

I doubt it, but I don't have an alternative theory either, so who knows?
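For what it's worth, a quick way to see whether a page is bumping into the
32K/64K textarea limits mentioned above is just to count the bytes in its
wiki-text. A minimal sketch (`page.txt` is a hypothetical file holding the
saved page source; here it's filled with 70000 dummy bytes only so the
check can be run stand-alone):

```shell
# 'page.txt' is a stand-in for the page's saved wiki-text; the dummy
# fill below just makes the sketch runnable on its own.
head -c 70000 /dev/zero > page.txt

SIZE=$(wc -c < page.txt)
echo "page is $SIZE bytes"

# Thresholds where some old browsers reportedly truncate textareas.
[ "$SIZE" -gt 32768 ] && echo "over 32K"
[ "$SIZE" -gt 65536 ] && echo "over 64K"

rm -f page.txt
```

If the duplicated paragraphs only show up on pages past one of those
boundaries, that would point strongly at the browser rather than PhpWiki.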
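As for checking the MySQL side, something like the sketch below covers the
table-check and disk-space questions. The data directory path is an
assumption -- adjust it to wherever your PhpWiki tables actually live
(and stop mysqld, or at least flush/lock the tables, before running
myisamchk on live files):

```shell
# Assumed location of the PhpWiki MyISAM tables -- adjust as needed.
DATADIR=/var/lib/mysql/phpwiki

# Check the tables for corruption (guarded so the sketch still runs
# on hosts where myisamchk isn't installed).
if command -v myisamchk >/dev/null 2>&1; then
    myisamchk --check "$DATADIR"/*.MYI
else
    echo "myisamchk not found -- run this on the host that has MySQL"
fi

# Make sure the filesystem holding the tables isn't full.
df -k "$DATADIR" 2>/dev/null || df -k /
```

If myisamchk reports problems, `myisamchk --recover` on the affected
table (with the server stopped) is the usual next step.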