From: Houman K. <kho...@we...> - 2004-04-29 06:44:08
Hi there,

I am currently evaluating different XQuery solutions for my Master's thesis, which will hopefully be published by the end of this year. I have a database with 2.3 million records and 3 columns, and I am running a stress test against SQL. So far none of the XQuery solutions has been able to handle the 230 MB XML file.

I hoped eXist would do better, since it is both an XML database and an XQuery engine, which should make it more powerful. Indeed, in eXist I can at least do a 'count' over this huge document and get a result back (not possible with the other solutions yet). But if I try other selects, I get a java.lang.OutOfMemoryError, like with the other solutions.

In this example I should get 2.3 million fields called 'ticks', which is an integer value:

    for $wb in doc("/db/ippm/wisc_berkeley.xml")/RESULTS/ROW
    return $wb/ticks

Another example, selecting just the 'ticks' from August 1st, 2000. Here I should get 80,800 records, but I still get a java.lang.OutOfMemoryError:

    for $wb in doc("/db/ippm/wisc_berkeley.xml")/RESULTS/ROW
    where $wb/ndate = '20000801000000'
    return $wb/ticks

In my opinion, if XQuery cannot handle this amount of data, it cannot compete with SQL. Is it possible to tune eXist? I have already tried to tune it myself: startup.bat and client.bat both have the parameter -Xmx256000k, and I start client.bat -l to query the information locally. What else can be done?

I have tried it with eXist-1.0b1.jar and the newest version from CVS, on a Pentium-M 1.7 GHz with 1 GB RAM.

I appreciate any help.

Regards,
Houman M. Khorasani
University of Wisconsin Platteville
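As a sanity check on the -Xmx tuning, here is a small standalone Java snippet (my own addition, not part of eXist) that prints the heap ceiling the JVM actually received; running it with the same flags the batch files pass would confirm whether the 256000k setting is really reaching the process:

```java
// HeapCheck.java - verify that the -Xmx flag took effect.
// Run e.g.:  java -Xmx256000k HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() returns the maximum number of bytes the JVM
        // will attempt to use, i.e. the effective -Xmx ceiling.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

If the printed value is much lower than expected, the batch file is probably not forwarding the flag to the java command.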