RE: [xmljs-users] Parsing huge xml files !!
From: James O. <jol...@bo...> - 2006-03-21 16:31:12
Ivo,
I suggest using DOM to process all those nodes. The DOM processing
should be well under 10 seconds for 5000 nodes. I've actually used DOM
processing on ~3000 nodes and it usually completed in 2-3 seconds. In
fact I did a side-by-side comparison of SAX to DOM using xmljs's SAX
parser, and DOM was about a second faster (both in IE and Firefox). Also,
when using DOM it's best to use getElementsByTagName(). For example, given
an XML file that looks like the following:
<?xml version="1.0"?>
<myFile>
  <head>
    <!-- header information, metadata etc... -->
  </head>
  <body>
    <node>
      <!-- node data -->
    </node>
    <node>
      <!-- more node data -->
    </node>
    <node>
      <!-- etc... -->
    </node>
    <node>
      <!-- etc... -->
    </node>
    <node>
      <!-- etc... -->
    </node>
  </body>
</myFile>
You'd use var myNodes = document.getElementsByTagName('node'); to grab a
full listing of all your file's nodes. You can then run through the
myNodes list and process the subnodes or data however you wish.
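A minimal sketch of that loop, assuming the <node> layout above (the
processNodes helper is a hypothetical name, not part of any xmljs API;
the extraction logic is kept pure so it works on any array-like list of
elements, such as the live NodeList getElementsByTagName() returns):

```javascript
// Hypothetical processing step: pull the text content out of each <node>.
// Accepts any array-like list of elements, e.g. the NodeList returned by
// document.getElementsByTagName('node').
function processNodes(nodeList) {
  var results = [];
  for (var i = 0; i < nodeList.length; i++) {
    // firstChild is the text node inside <node> in this simple layout
    results.push(nodeList[i].firstChild.nodeValue);
  }
  return results;
}

// In the browser you would call it like this:
// var myNodes = document.getElementsByTagName('node');
// var data = processNodes(myNodes);
```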
Again, this runs surprisingly fast and my guess is that's because when
the XML file is initially loaded (either on initial load of the page or
via an AJAX call) the browser parses the file using an optimized parser
and then exposes the information to you via the DOM.
If the XML seems to be processed slowly and takes forever to display, I
recommend analyzing your display code, not your processing code. Inside a
loop, always accumulate your output HTML in a temporary variable; once the
loop is done, append that temporary node or assign the variable to your
target element's innerHTML. Don't update the document inside the loop,
because that will drastically decrease your performance.
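That buffering pattern can be sketched like so (buildHtml and the
'output' element id are assumptions for illustration; the point is that
the string is built entirely in memory and the DOM is touched once):

```javascript
// Hypothetical render step: concatenate all markup into a string first,
// so the expensive DOM write happens a single time, after the loop.
function buildHtml(values) {
  var html = '';
  for (var i = 0; i < values.length; i++) {
    html += '<li>' + values[i] + '</li>';
  }
  return html;
}

// In the browser -- one DOM update instead of one per iteration:
// document.getElementById('output').innerHTML = buildHtml(data);
```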
Good luck and let me know if you're having any further issues.
James Oltmans