[Kernow] xquery on large data: memory increase? or something else?
From: Dr. C. D. <du...@te...> - 2012-04-10 10:39:26
Hello,

I want to run XQuery on medium to large files (100 MB to 600 MB of HTML). If I run the query on a portion of one of these files smaller than 100 MB, everything runs fine. As soon as I run the query on the complete file, Kernow works for a while and then appears to stall, even though I never get control back: CPU usage drops from 30-50% down to 1-3%, and memory consumption climbs from 700 MB to about 1.3 GB, although I have plenty of memory left.

I have changed the JRE parameters to -Xmx1024m or even -Xmx2048m (via Control Panel / Java / Java Runtime Environment parameter settings), but it changes nothing: I still hit this magic limit of 100 MB of data I can run an XQuery on.

What can I do? Where can I tell Kernow to actually use more memory? Or is the problem something else?

I run a Windows XP SP3 PC with 4 GB of physical memory.

Thanks,
Christian
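One way to check whether the -Xmx setting is actually reaching the JVM that Kernow runs in is to query the running JVM's maximum heap. A minimal sketch using the standard Java Runtime API (the class name here is just illustrative):

```java
// Prints the maximum heap the current JVM is willing to use,
// so you can verify whether an -Xmx setting was actually applied.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Note that the Control Panel Java runtime parameters generally apply to applets and Java Web Start; if Kernow is started from its own launcher script or jar, those settings may not take effect, and the flag would instead need to go on the java command line that launches it (e.g. `java -Xmx2048m -jar <kernow jar>`, exact jar name depending on your install).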