The SAX parser would probably be the fastest, although the classic DOM might be very close. Either
way, you're probably well into go-get-a-cup-of-coffee-while-it-parses territory with 4000 nodes.
At this point, you may be better served by loading the XML into the browser's native parser and
accepting the browser limitations that entails.
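For example, something along these lines (just an untested sketch - xmlString stands in for however
you fetch the document, and the "item" tag name is only a placeholder):

    // Hand the raw XML string to the browser's built-in parser.
    // (Older IE lacks DOMParser; there you'd use new ActiveXObject("Microsoft.XMLDOM") instead.)
    var parser = new DOMParser();
    var doc = parser.parseFromString(xmlString, "text/xml");

    // Walk only the nodes you actually care about.
    var items = doc.getElementsByTagName("item");
    for (var i = 0; i < items.length; i++) {
        // process items[i] here
    }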
I'm not familiar enough with JSON to say whether it will help or not - sorry...
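If you want to experiment with it anyway, the rough idea would be to have the PHP side emit the
stripped-down data as JSON and read it on the client with JSON.parse (older browsers need
Crockford's json2.js for that). A sketch, where jsonString and the items field are only placeholders:

    // jsonString is whatever the server sends back as JSON text.
    var data = JSON.parse(jsonString);

    // You then work with plain JavaScript objects instead of DOM nodes.
    for (var i = 0; i < data.items.length; i++) {
        // process data.items[i] here
    }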
Best regards,
David
--- Ivo Benedito <ivo...@gm...> wrote:
> Hi,
>
> I need to parse XML files with 5000-plus lines in JavaScript. Since that
> means a big performance hit in the web app, I had to move the parsing to
> the server side with PHP. What I did then was strip out the parts of the
> XML file that don't matter to me (tags I don't need in the app), which
> saves some thousands of lines. But even with that, I still get some XML
> files with more than 4000 lines, so parsing all those lines is always a
> bit slow. Can you confirm whether a SAX parser is the best way to parse
> these XML files, or is there something better? Maybe I could try JSON too... :|
>