
#159 dom::destroy hangs on large amounts of data

Tracker: Script Library
Status: open
Labels: TclDOM (91)
Priority: 5
Updated: 2006-03-23
Created: 2006-03-09
Private: No

We use large data files (20 MB of XML, 400k nodes) to
represent tree-structured data. Parsing the data and
immediately destroying the DOM works without a problem.

After processing the data with multiple calls to the
selectNode and stringVal methods, the DOM can't be
deleted using dom::destroy. The process simply runs at
100% CPU load forever, but its memory usage doesn't change
(the machine is big enough to hold everything in RAM).

We did not check how many selectNode calls it takes to
trigger this, but there must be quite a few.

This happened on several different Linux-i386 installations.
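
For illustration, a minimal sketch of the access pattern described
above, under the assumption that the TclDOM 3.x convenience commands
dom::parse, dom::selectNode, dom::node stringValue and dom::destroy
are the ones in use; the file name and the //* XPath expression are
placeholders, not the reproduction script attached later in this
thread.

    # Minimal sketch (assumed API, not the attached script):
    # parse a large document, touch many nodes via XPath, then destroy.
    package require dom::libxml2

    # Slurp and parse the large XML file (placeholder file name).
    set f [open big-data.xml r]
    set doc [dom::parse [read $f]]
    close $f

    # Visit a large number of nodes via XPath, as the application does.
    foreach node [dom::selectNode $doc {//*}] {
        dom::node stringValue $node
    }

    # This is the call that pegs the CPU at 100% for a very long time.
    dom::destroy $doc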

Discussion

  • Hanno Steinke - 2006-03-09

    I forgot to mention that we are using tcldom::libxml2 3.1 from
    CVS as of 2006-03-07.

     
  • Steve Ball - 2006-03-23
    • assigned_to: nobody --> balls
     
  • Steve Ball - 2006-03-23

    Are you able to supply a Tcl script that exercises this
    problem? I'd ask for a sample data file too, but its large
    size may be problematic. Perhaps you could post a sample on a
    website so that I can download it?

     
  • Hanno Steinke - 2006-03-23

    Attachment: script to reproduce the problem

     
  • Hanno Steinke - 2006-03-23

    I attached a script that reproduces the problem. It is very
    simple, but it does exactly the same thing as our application.

     
  • Hanno Steinke - 2006-03-24

    After playing with the example script, I found that it does in
    fact destroy the DOM after some time (45 minutes). Given that my
    original app also makes use of text and attribute nodes, and
    probably just takes half a day to destroy its DOM, this makes me
    think that there is no failure in the code and that we should
    change the subject to something like:

    "dom::destroy takes ages on large amounts of data".

    The current workaround is to process each file in a subprocess
    that exits after processing (see the sketch after this
    discussion). Not very elegant, though.

     

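A hedged sketch of the per-file subprocess workaround mentioned in the
last comment, assuming a hypothetical helper script process-one-file.tcl
that parses a single file, writes out whatever was extracted, and exits
without calling dom::destroy, so the operating system reclaims the DOM
memory at process exit:

    # One short-lived child tclsh per input file (workaround sketch).
    # "process-one-file.tcl" and the data/*.xml layout are hypothetical.
    foreach file [glob -nocomplain data/*.xml] {
        # exec blocks until the child exits; skipping dom::destroy in
        # the child is fine because process exit frees everything at once.
        set result [exec [info nameofexecutable] process-one-file.tcl $file]
        puts "$file -> $result"
    }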
