[phpXML] Re: [phpXML] Large amounts of data
From: <ph...@pe...> - 2001-10-08 16:19:51
On Friday 06 July 2001 17:30, you wrote:

> I'm using phpXML to export/import object hierarchies in an application
> server. I haven't run into the memory limit yet, but when the
> application server starts getting some real use, people may want to
> back up an entire site.
>
> I'm thinking of creating an extension of phpXML that stores node
> content in a temporary file instead of keeping it in memory. The method
> set_external_content() will write to a file pointer and then set some
> node attributes like ext_ctnt_position and ext_ctnt_length. The method
> get_external_content() will use those attributes to seek to that
> position in the file and grab the specified content.
>
> Any thoughts on this? Pitfalls? Interest?

The obvious disadvantage for one-off use is that it would be much slower.
If this were reasonably static data, you could store the result in a
permanent file as a sort of XPath index, and then use that in your
scripts. I quite like that idea, especially as other users could then use
the index instead of rereading the file every time.

XML files are no different from any other flatfile 'database': reading an
entire large file to extract a small amount of data is not a very
efficient thing to do. Without knowing your application, I would say as a
rule of thumb: if XML files start getting large, either move to a DBMS
with proper indexing or find some way to split them into smaller files.
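
For what it's worth, here is a rough sketch of the mechanics you describe,
written as plain functions rather than methods on the phpXML class so it
doesn't assume anything about the library's internals. The position/length
pair returned here is what ext_ctnt_position and ext_ctnt_length would hold
as node attributes; the names and signatures are only illustrative.

<?php
// Append $content to the temp file $fp and return array(position, length)
// for the caller to stash in the node's attributes.
function set_external_content($fp, $content)
{
    fseek($fp, 0, SEEK_END);
    $position = ftell($fp);
    fwrite($fp, $content);
    return array($position, strlen($content));
}

// Seek back to the recorded offset and read the content out again.
function get_external_content($fp, $position, $length)
{
    if ($length == 0) {
        return "";
    }
    fseek($fp, $position);
    return fread($fp, $length);
}

// Usage: tmpfile() gives a read/write handle that is removed
// automatically when the script ends.
$fp = tmpfile();
list($pos, $len) = set_external_content($fp, "some node content");
echo get_external_content($fp, $pos, $len);
?>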
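
And a similarly rough sketch of the permanent-index idea: do the expensive
extraction once, serialize the result next to the XML file, and let later
requests read that small file instead of re-parsing the whole document.
build_site_index() below is just a stand-in for whatever phpXML calls you
would actually use.

<?php
// $build names a function that does the expensive extraction and returns
// something serialize()-able, e.g. an array of the bits you care about.
function get_index($xml_file, $index_file, $build)
{
    // Reuse the stored index unless the XML file has changed since.
    if (file_exists($index_file) &&
        filemtime($index_file) >= filemtime($xml_file)) {
        $fp = fopen($index_file, "r");
        $index = unserialize(fread($fp, filesize($index_file)));
        fclose($fp);
        return $index;
    }
    $index = $build($xml_file);
    $fp = fopen($index_file, "w");
    fwrite($fp, serialize($index));
    fclose($fp);
    return $index;
}

// Example stand-in for the real extraction step.
function build_site_index($xml_file)
{
    return array("size" => filesize($xml_file));
}

// $index = get_index("site.xml", "site.idx", "build_site_index");
?>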