I've had some discussion with a couple of Pyro developers about using
xml_pickle with Pyro (in place of pickle/cPickle). It looks like
it works now, but we need to do a 'setParanoia(-1)' instead of
'setParanoia(0)'. That's not really a big thing, since even -1 avoids
cPickle's trojan problem. But our documentation claimed 0 was "cPickle
compatible". Anyway, it looks like my xml_pickle co-author has
identified the problem, and the next little patch (probably 1.0.1) will
fix that. The issue probably has to do with some indirect imports
On perhaps a more serious issue, Frank McIngvale (said co-author) has
been working on a C extension parser for xml_pickle. Preliminary
results seem promising. When and if this extension is part of our
release, it will be a drop-in replacement for other parsers, selected
with 'setParser()'... the API and file format will be the same. Anyway,
this is what Frank has indicated in terms of speed (the test_speed.py
example in our inventory also contains comments with preliminary
numbers comparing our DOM/SAX/cEXPAT parsers):
-------- Forwarded message --------
Date: Mon, 13 May 2002 11:05:23 -0500 (CDT)
From: Frank McIngvale <frankm@...>
To: "Dr. David Mertz" <mertz@...>

Hm, looks like cPickle really isn't that fast at loading.
I'm working on a new speed test; right now it's ~5Mb of pure
data objs (ints,floats,strs,dicts,lists), randomly generated
to ensure no reffing is done. Here are the times:
So (unless cEXPAT is horribly broken :-) we should easily
beat cPickle at loading. Dumping looks pretty bad, but
we're pure-python now, and a few carefully placed C functions
(like the _tag_ funcs?) might do a world of good there as well.
cEXPAT load SHOULD be on the order of 4 secs, so a dumps()
time of 8 secs would equal cPickle in overall time. That's less
than a 3x speedup; surely we can do that!
I'll try cEXPAT soon to confirm the numbers ...
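The load-versus-dump comparison Frank describes can be sketched with a small stdlib harness -- this times dumps() and loads() separately, as his numbers do. (My sketch, not Frank's test_speed.py.)

```python
import pickle
import time

def time_pickle(data, runs=3):
    """Return (avg dump seconds, avg load seconds) for `data`."""
    t0 = time.perf_counter()
    for _ in range(runs):
        blob = pickle.dumps(data)
    t_dump = (time.perf_counter() - t0) / runs

    t0 = time.perf_counter()
    for _ in range(runs):
        pickle.loads(blob)
    t_load = (time.perf_counter() - t0) / runs
    return t_dump, t_load
```

The same harness works for any serializer exposing dumps()/loads(), so an xml_pickle parser backend could be dropped into the same slot for an apples-to-apples comparison.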