From: Toby D. <to...@ta...> - 2007-02-14 18:51:13
On Wednesday 14 February 2007 17:29, Ed Leafe wrote:
> On Feb 13, 2007, at 3:13 AM, Toby Dickenson wrote:
> >> DirectoryStorageError: Pickle checksum error reading oid
> >> '000000000002BD7C'
> >
> > That's the result of the other bit of the quoted email....
>
> Not to be a pain, but did you learn anything from the stuff I sent
> you?

Which "stuff"? If you mean the traceback, no, that didn't tell me
anything I didn't expect. If you've sent some other stuff, it didn't
arrive. I have had some other emails delayed or lost over the weekend,
so please resend.

> I'm doing incremental backups every night, and even though very
> little changes (maybe a couple of ZWiki pages), I'm getting over
> 100MB files each night, *after* gzipping them.
>
> Is there anything I can do to fix this?

I suggest untarring a typical backup and examining the content. It will
contain one file for every new revision of every modified object. You
can use the dumpdsf tool on those files (or a subset of them, if there
are too many) to see which classes are causing the backup bloat. A
rough sketch for picking out the biggest files to look at first is
below my signature.

My guess is that you are using ZCatalog, and your ZWiki page changes
are causing many BTree node objects to be modified. I'm not aware of
any way to avoid that.

--
Toby Dickenson
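The sketch below is not part of DirectoryStorage, just a rough
illustrative Python script (the default file name backup.tar.gz, the
assumption that the backup is a gzipped tarball, and the top-20 cutoff
are all placeholders). It lists the largest members of the backup so
you can pick a small subset of files to pass to dumpdsf instead of
running it on everything.

import sys
import tarfile

# Default file name is only a placeholder; pass the real backup on the
# command line instead.
if len(sys.argv) > 1:
    path = sys.argv[1]
else:
    path = "backup.tar.gz"

tar = tarfile.open(path, "r:gz")
try:
    # The backup contains one member per new revision of each modified
    # object; collect only regular files.
    members = [m for m in tar.getmembers() if m.isfile()]
finally:
    tar.close()

# Largest members first.
members.sort(key=lambda m: m.size, reverse=True)

total = sum(m.size for m in members)
print("%d files, %d bytes uncompressed" % (len(members), total))

# Show the 20 biggest files; these are the ones worth feeding to dumpdsf.
for m in members[:20]:
    print("%10d  %s" % (m.size, m.name))

Running it as "python listbig.py mybackup.tar.gz" (both names are made
up) prints the biggest files; point dumpdsf at whichever of those
dominate the listing.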