We use large data files (20MB of XML, ~400k nodes) to
represent tree-structured data. Parsing the data and
immediately destroying the DOM works without a problem.
After processing the data with multiple calls to the
selectNode and stringVal methods, however, the DOM can't be
deleted using dom::destroy: the process simply runs at
100% CPU load indefinitely, while its memory usage doesn't
change (the machine is big enough to hold everything in RAM).
We did not check how many selectNode calls it takes to
trigger this, but it must be quite a few.
This happened on several different Linux-i386 installations.
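
For reference, the access pattern looks roughly like this
minimal sketch (the file name and the //item XPath expression
are illustrative assumptions, and the exact stringValue
invocation is guessed from the "stringVal" mention above):

    package require dom 3.1

    # Parse one large file (roughly 20MB / 400k nodes in our case).
    set f [open big.xml r]
    set doc [dom::parse [read $f]]
    close $f

    # Query the tree many times before tearing it down.
    foreach node [dom::selectNode $doc //item] {
        set value [dom::node stringValue $node]
    }

    # This is the call that then pegs the CPU at 100% load.
    dom::destroy $doc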
Comment by user_id=648126:
Forgot to mention: we use tcldom::libxml2 3.1 from CVS
as of 2006-03-07.
Comment by user_id=81778:
Are you able to supply a Tcl script that exercises this
problem? I'd ask for a sample data file too, but the
large size may be problematic. Perhaps you can post a
sample to a website and I can download it?
Attachment: script to reproduce the problem
Comment by user_id=648126:
I have attached a script that reproduces the problem. It is
very simple, but does exactly the same thing as our application.
Comment by user_id=648126:
After playing with the example script, I found that it does in
fact destroy the DOM after some time (45 minutes). Given that
my original app also makes use of text and attribute nodes,
and probably just takes half a day to destroy, I now think
there is no failure in the code and we should change the
subject to something like:
"dom::destroy takes ages on large amounts of data".
The current workaround is to process each file in a subprocess
that exits after processing, as sketched below. Not very
elegant, though.
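
A minimal sketch of that workaround, assuming a hypothetical
helper script process_one.tcl that parses one file, does the
selectNode/stringVal work, and then simply exits, letting the
OS reclaim the memory at process exit:

    # Run each file through a short-lived tclsh; data/*.xml is an
    # assumed location for the input files.
    foreach file [glob -nocomplain data/*.xml] {
        exec [info nameofexecutable] process_one.tcl $file
    }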