From: Richa G. ,G. <ric...@hc...> - 2007-10-22 04:44:38
|
While applying xsl on a large xml file (say 20 mb), we are getting an OutOfMemoryError:

org.xmldb.api.base.XMLDBException: java.lang.Exception: java.lang.OutOfMemoryError: Java heap space

Following is the stacktrace for the error from exist.log:

2007-10-22 10:08:27,512 [P1-9] DEBUG (ServletHandler.java [getRealPath]:720) - getRealPath of /apply_xsl.xqy in org.mortbay.jetty.servlet.WebApplicationHandler in WebApplicationContext[eXist XML Database,eXist XML Database]
2007-10-22 10:08:27,512 [P1-9] DEBUG (ServletHandler.java [getRealPath]:720) - getRealPath of / in org.mortbay.jetty.servlet.WebApplicationHandler in WebApplicationContext[eXist XML Database,eXist XML Database]
2007-10-22 10:08:27,965 [P1-9] DEBUG (XQuery.java [compile]:154) - Query diagnostics:
let
$root := xmldb:collection("xmldb:exist:///db", "admin", "admin") return
transform:transform(doc("/db/mytest/main.xml"), doc("/db/xsl/myXSL.xsl"), )
2007-10-22 10:08:27,965 [P1-9] DEBUG (XQuery.java [compile]:156) - Compilation took 375
...
2007-10-22 10:08:32,465 [P1-9] DEBUG (NativeSerializer.java [serializeToReceiver]:129) - serializing document 402 (/db/mytest/main.xml) to SAX took 4172
2007-10-22 10:08:40,418 [P1-9] WARN (ServletHandler.java [handle]:574) - Error for /exist/apply_xsl.xqy
java.lang.OutOfMemoryError: Java heap space

Please suggest: is this some limitation of eXist?
Also, please provide some solution to this problem.

DISCLAIMER:
-----------------------------------------------------------------------------------------------------------------------
The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only. It shall not attach any liability on the originator or HCL or its affiliates. Any views or opinions presented in this email are solely those of the author and may not necessarily reflect the opinions of HCL or its affiliates. Any form of reproduction, dissemination, copying, disclosure, modification, distribution and / or publication of this message without the prior written consent of the author of this e-mail is strictly prohibited. If you have received this email in error please delete it and notify the sender immediately. Before opening any mail and attachments please check them for viruses and defect.
-----------------------------------------------------------------------------------------------------------------------
|
From: Pierrick B. <pie...@fr...> - 2007-10-22 05:45:06
|
Hi,

Richa Gupta ,Gurgaon wrote:

> While applying xsl on a large xml file (say 20 mb), we are getting
> OutOfMemoryError.
>
> _org.xmldb.api.base.XMLDBException_: _java.lang.Exception_:
> java.lang.OutOfMemoryError: Java heap space
>
> Please suggest Is this some limitation of eXist?

Probably not. Rather a limitation of your Java Virtual Machine.

> Also provide some solution to solve this problem.

Please ?

Search the mailing-list archives: this question is a frequently asked one.

Cheers,

p.b. |
From: Richa G. ,G. <ric...@hc...> - 2007-10-22 10:14:05
|
Thanks for your suggestion.

We looked on the forum for OutOfMemoryError, and made changes to the JVM cache and heap size in conf.xml & startup.sh.

In conf.xml, the values for db-connection are:

<db-connection cacheSize="48M" collectionCacheSize="128" database="native" files="webapp/WEB-INF/data" free_mem_min="5" pageSize="4096">

In startup.sh, we have:

if [ -z "$JAVA_OPTIONS" ]; then
    export JAVA_OPTIONS="-Xms16000k -Xmx768000k -Dfile.encoding=UTF-8"

But we are still getting the same error. Please suggest a way to solve this problem. We are using xmldb's XQueryService to run the transform xquery.

Thanks,
Richa

-----Original Message-----
From: Pierrick Brihaye [mailto:pierrick.brihaye@free.fr]
Sent: Monday, October 22, 2007 11:15 AM
To: Richa Gupta ,Gurgaon
Cc: Exist-open@lists.sourceforge.net
Subject: Re: [Exist-open] problem while processing large xmls

Hi,

Richa Gupta ,Gurgaon wrote:

> While applying xsl on a large xml file (say 20 mb), we are getting
> OutOfMemoryError.
>
> _org.xmldb.api.base.XMLDBException_: _java.lang.Exception_:
> java.lang.OutOfMemoryError: Java heap space
>
> Please suggest Is this some limitation of eXist?

Probably not. Rather a limitation of your Java Virtual Machine.

> Also provide some solution to solve this problem.

Please ?

Search the mailing-list archives: this question is a frequently asked one.

Cheers,

p.b. |
From: Christofer D. <mai...@c-...> - 2007-10-22 10:28:40
|
Hi,

I think you might be experiencing a problem with the memory manager of your Java virtual machine. The problem with this is that every time the VM increases the amount of allocated memory, it expects this to be in one block. Whenever there is no unfragmented block able to contain all the VM memory, an OutOfMemory exception occurs.

I would suggest increasing Xms and setting the Xmx value to the same value as Xms. This way the entire block is allocated during VM startup and OutOfMemory exceptions should no longer occur.

In general, when transforming very large XML documents, I would suggest using SAX to do the transformation. Doing a DOM transformation of a 20MB XML document could result in a memory consumption of about 50-50MB, depending on the document structure and the transformation.

Hope this helps ;)

Chris

Richa Gupta ,Gurgaon wrote:

> Thanks for you suggestion.
>
> We looked on forum for OutOfMemoryError. And made changes for JVM cache and heap size in conf.xml & startup.sh file.
>
> In conf.xml, values for db-connection are:
> <db-connection cacheSize="48M" collectionCacheSize="128" database="native" files="webapp/WEB-INF/data" free_mem_min="5" pageSize="4096">
>
> In startup.sh, we have
> if [ -z "$JAVA_OPTIONS" ]; then
>     export JAVA_OPTIONS="-Xms16000k -Xmx768000k -Dfile.encoding=UTF-8"
>
> But we are still getting the same error. Please suggest a way to solve this problem.
>
> We are using xmldb's XQueryService to run transform xquery.
>
> Thanks,
> Richa
>
> -----Original Message-----
> From: Pierrick Brihaye [mailto:pie...@fr...]
> Sent: Monday, October 22, 2007 11:15 AM
> To: Richa Gupta ,Gurgaon
> Cc: Exi...@li...
> Subject: Re: [Exist-open] problem while processing large xmls
>
> Hi,
>
> Richa Gupta ,Gurgaon wrote:
>
>> While applying xsl on a large xml file (say 20 mb), we are getting
>> OutOfMemoryError.
>>
>> _org.xmldb.api.base.XMLDBException_: _java.lang.Exception_:
>> java.lang.OutOfMemoryError: Java heap space
>>
>> Please suggest Is this some limitation of eXist?
>
> Probably not. Rather a limitation of your Java Virtual Machine.
>
>> Also provide some solution to solve this problem.
>
> Please ?
>
> Search the mailing-list archives : this question if a frequently asked one.
>
> Cheers,
>
> p.b.
>
> _______________________________________________
> Exist-open mailing list
> Exi...@li...
> https://lists.sourceforge.net/lists/listinfo/exist-open
|
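The Xms/Xmx advice above can be verified from inside the JVM itself. This is a minimal sketch (the class name is hypothetical, not part of eXist): run it with the same JAVA_OPTIONS as eXist to confirm the flags in startup.sh actually reached the VM.

```java
// HeapCheck.java - hypothetical helper for checking that -Xms/-Xmx took effect.
// Run with e.g.: java -Xms768m -Xmx768m HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // maxMemory() reflects -Xmx; totalMemory() is the currently
        // committed heap, which at startup corresponds to -Xms.
        System.out.println("max heap (MB):       " + rt.maxMemory() / mb);
        System.out.println("committed heap (MB): " + rt.totalMemory() / mb);
    }
}
```

If the printed max heap is far below what startup.sh sets, the options are not being passed through to the process that actually runs eXist.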
From: Dannes W. <di...@gm...> - 2007-10-22 08:02:32
|
hi,

On 10/22/07, Richa Gupta ,Gurgaon <ric...@hc...> wrote:

> While applying xsl on a large xml file (say 20 mb), we are getting OutOfMemoryError.
> java.lang.Exception: java.lang.OutOfMemoryError: Java heap space
> Please suggest Is this some limitation of eXist? Also provide some solution to solve this problem.

I guess that if you do this on the command line you'll see the same error. No, probably it is not an issue of eXist but of the XSLT engine.

Solution?
- rewrite your xsl script
- use Saxon (Mr. Kay has optimized his XSLT engine for speed, memory consumption, ...); you can configure eXist to use Saxon very easily.

D.

--
# Dannes Wessels # The Netherlands # |
From: Richa G. ,G. <ric...@hc...> - 2007-10-22 10:20:07
|
Hi,

Thanks for your suggestion, Dannes. But we have already configured eXist to use Saxon.

We applied the xsl on small xml files (of size up to 1 mb). It worked fine. But while applying the same xsl on large data (say 10-15 mb), we get OutOfMemoryError. So it is not a problem with the XSLT engine but with some JVM configuration. Please suggest any solution for the same.

Thanks,
Richa

-----Original Message-----
From: Dannes Wessels [mailto:dizzzz@gmail.com]
Sent: Monday, October 22, 2007 1:32 PM
To: Richa Gupta ,Gurgaon
Cc: Exist-open@lists.sourceforge.net
Subject: Re: [Exist-open] problem while processing large xmls

hi,

On 10/22/07, Richa Gupta ,Gurgaon <richa.gupta@hcl.in> wrote:

> While applying xsl on a large xml file (say 20 mb), we are getting OutOfMemoryError.
> java.lang.Exception: java.lang.OutOfMemoryError: Java heap space
> Please suggest Is this some limitation of eXist? Also provide some solution to solve this problem.

I guess that if you do this on 'command line' you'll see the same error. No probably it is not an issue of eXist but from the XSLT engine.

solution?
- rewrite your xsl script
- use Saxon (mr. Kay has optimized his XSLT engine for speed, memory consumption...), you can configure eXist to use Saxon very easy.

D.

--
# Dannes Wessels # The Netherlands # |
From: Wolfgang M. <wol...@gm...> - 2007-10-22 10:28:38
|
> Thanx for ur suggestion Dannes. But we have already configured eXist to
> use Saxon.
>
> We applied xsl on small xml files (of size upto 1 mb). It worked fine.
> But while applying the same xsl on large data (say 10 - 15 mb), we get
> OutOfMemoryError.

768mb memory should be more than sufficient for an XSL transformation on a small 15mb document. Are you executing the query on an idle database, or are there other queries running in parallel?

Wolfgang |
From: Dannes W. <di...@gm...> - 2007-10-22 11:51:11
|
Hi,

On 10/22/07, Richa Gupta ,Gurgaon <ric...@hc...> wrote:

> So it is not a problem with the XSLT engine but with some JVM configuration.
> Please suggest any solution for the same.

Well, run the transformation outside the database (command line, oXygen XML, ...) and check the behaviour. If you are running Java 5/6, you could use jconsole to monitor the JVM.

D.

--
# Dannes Wessels # The Netherlands # |
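If attaching jconsole is inconvenient, heap usage can also be logged from code with the standard java.lang.management API (available since Java 5). This is a sketch; the 10 MB byte array only simulates holding a large document in memory, and the class name is hypothetical.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapSnapshot {
    // Print current heap usage; call before and after the
    // transformation step under suspicion.
    static void report(String label) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println(label + ": used=" + heap.getUsed() / (1024 * 1024)
                + "MB max=" + heap.getMax() / (1024 * 1024) + "MB");
    }

    public static void main(String[] args) {
        report("before");
        byte[] blob = new byte[10 * 1024 * 1024]; // stand-in for a large DOM tree
        blob[0] = 1; // keep the array reachable so it is not optimized away
        report("after");
    }
}
```

Wrapping the transform:transform call site (or the XQueryService call in client code) with such reports shows how close each document size pushes the heap to its -Xmx ceiling.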
From: Adam R. <ada...@de...> - 2007-10-22 10:26:14
|
Try running the transformation with Saxon outside of eXist, and watch the memory use; this should give you an idea of how much memory your transform requires. It would seem that it needs more than 768MB, but I find that surprising!

-----Original Message-----
From: exi...@li... on behalf of Richa Gupta ,Gurgaon
Sent: Mon 22/10/2007 11:18
To: Dannes Wessels
Cc: Exi...@li...
Subject: Re: [Exist-open] problem while processing large xmls

Hi,

Thanx for ur suggestion Dannes. But we have already configured eXist to use Saxon.

We applied xsl on small xml files (of size upto 1 mb). It worked fine. But while applying the same xsl on large data (say 10 - 15 mb), we get OutOfMemoryError. So it is not a problem with XSLT engine but with some JVM configuration. Please suggest any solution for same.

Thanks,
Richa

-----Original Message-----
From: Dannes Wessels [mailto:di...@gm...]
Sent: Monday, October 22, 2007 1:32 PM
To: Richa Gupta ,Gurgaon
Cc: Exi...@li...
Subject: Re: [Exist-open] problem while processing large xmls

hi,

On 10/22/07, Richa Gupta ,Gurgaon <ric...@hc...> wrote:

> While applying xsl on a large xml file (say 20 mb), we are getting OutOfMemoryError.
> java.lang.Exception: java.lang.OutOfMemoryError: Java heap space
> Please suggest Is this some limitation of eXist? Also provide some solution to solve this problem.

I guess that if you do this on 'command line' you'll see the same error. No probably it is not an issue of eXist but from the XSLT engine.

solution?
- rewrite your xsl script
- use Saxon (mr. Kay has optimized his XSLT engine for speed, memory consumption...), you can configure eXist to use Saxon very easy.

D.

--
# Dannes Wessels # The Netherlands # |
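Running the transform outside eXist, as suggested above, needs nothing beyond the JDK's JAXP API; Saxon can be substituted for the default processor via the TransformerFactory system property (for Saxon 8, net.sf.saxon.TransformerFactoryImpl, assuming the Saxon jar is on the classpath). A self-contained sketch, with a toy stylesheet and input standing in for the real files:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class StandaloneTransform {
    public static void main(String[] args) throws Exception {
        // Toy stylesheet and input; for the real test, point StreamSource
        // at the exported 20 MB document and the actual stylesheet.
        String xsl = "<xsl:stylesheet version='1.0'"
                + " xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
                + "<xsl:template match='/'>"
                + "<out><xsl:value-of select='doc/item'/></out>"
                + "</xsl:template></xsl:stylesheet>";
        String xml = "<doc><item>hello</item></doc>";

        // To force Saxon instead of the JDK's built-in processor, start with
        // -Djavax.xml.transform.TransformerFactory=net.sf.saxon.TransformerFactoryImpl
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));

        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        System.out.println(out);
    }
}
```

Running this under -Xmx settings matching the eXist instance (and watching it in jconsole) separates the stylesheet's own memory appetite from anything eXist adds.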
From: Richa G. ,G. <ric...@hc...> - 2007-10-23 10:46:00
|
Hi,

We made changes in startup.sh (-Xms256000k -Xmx768000k) and conf.xml (<db-connection cacheSize="48M" collectionCacheSize="128" database="native" files="webapp/WEB-INF/data" free_mem_min="5" pageSize="4096">).

Our requirement is to apply a series of xsl on an xml in the repository. The output of one xsl is the input xml for the next xsl. For this, the output is saved on the file system as a new xml file and then imported into the eXist repository. We are using XQueryService to apply xsl on xml in the repository. XQueryService returns a ResourceSet. We iterate through the ResourceSet and use the getContent method to fetch the output into a string. This string is then written to a file on the file system.

Below is a snippet of our code:

ResourceSet result = service.query(query);
if (result != null) {
    ResourceIterator resIterator = result.getIterator();
    while (resIterator.hasMoreResources()) {
        Resource r = resIterator.nextResource();
        value = (String) r.getContent();
        System.out.println("Content: " + value);
    }
}

The code works fine for xmls of size 1-2 mb, but throws OutOfMemoryError - Java heap space - for files of size 5mb or more. The problem seems to be with the getContent() method.

Please provide a way/solution to this problem. Also, please suggest if there is some more optimized way of applying xsl and writing the output to a file.

Regards,
Richa

-----Original Message-----
From: Wolfgang Meier [mailto:wolfgangmm@gmail.com]
Sent: Monday, October 22, 2007 3:59 PM
To: Richa Gupta ,Gurgaon
Cc: Dannes Wessels; Exist-open@lists.sourceforge.net
Subject: Re: [Exist-open] problem while processing large xmls

> Thanx for ur suggestion Dannes. But we have already configured eXist
> to use Saxon.
>
> We applied xsl on small xml files (of size upto 1 mb). It worked fine.
> But while applying the same xsl on large data (say 10 - 15 mb), we get
> OutOfMemoryError.

768mb memory should be more than sufficient for an XSL transformation on a small 15mb document. Are you executing the query on an idle database or are there other queries running in parallel?

Wolfgang |
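Since each intermediate result in the chain above is written to the file system anyway, one way to sidestep getContent() materializing a multi-megabyte String is to run the XSL chain outside the query, streaming each step file-to-file with JAXP so no step's output is ever held as one String. This is only a sketch under assumptions: the class, file names, and the tiny wrapping stylesheet are hypothetical stand-ins for the real stylesheets and exported documents.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.File;
import java.io.FileWriter;

public class ChainedTransforms {

    // Apply one stylesheet, streaming straight from file to file so the
    // result never exists as a single in-memory String.
    static void apply(File xsl, File in, File out) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(xsl));
        t.transform(new StreamSource(in), new StreamResult(out));
    }

    static void write(File f, String s) throws Exception {
        FileWriter w = new FileWriter(f);
        w.write(s);
        w.close();
    }

    public static void main(String[] args) throws Exception {
        // Tiny stand-in data; in practice these would be the exported
        // repository document and the real stylesheets of the series.
        File xml = File.createTempFile("step0", ".xml");
        write(xml, "<doc><v>1</v></doc>");

        // Hypothetical stylesheet that wraps its input; applied twice in
        // sequence, the output of step 1 is the input of step 2.
        File xsl = File.createTempFile("wrap", ".xsl");
        write(xsl, "<xsl:stylesheet version='1.0' "
                + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
                + "<xsl:template match='/'><wrapped><xsl:copy-of select='*'/></wrapped>"
                + "</xsl:template></xsl:stylesheet>");

        File step1 = File.createTempFile("step1", ".xml");
        File step2 = File.createTempFile("step2", ".xml");
        apply(xsl, xml, step1);
        apply(xsl, step1, step2);
        System.out.println("done: " + step2.length() + " bytes");
    }
}
```

The transformer still needs working memory for its own tree of the input, but the chain no longer pays for a full String copy of every intermediate result on top of that.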
From: Christofer D. <mai...@c-...> - 2007-10-23 13:52:02
|
Sounds like the standard Cocoon task. Ever thought of letting Cocoon do the transformations and possibly writing your own serializer? This way memory consumption should be almost linear (ignoring the actual query memory usage).

Another suggestion would be to profile your xslts. I use oXygen to do this for me. This helps find the bottlenecks and reduce memory consumption by optimizing the transformation itself.

Regards,

Chris

Richa Gupta ,Gurgaon wrote:

> Hi,
>
> We made changes in startup.sh (-Xms256000k -Xmx768000k) and
> conf.xml (<db-connection cacheSize="48M" collectionCacheSize="128"
> database="native" files="webapp/WEB-INF/data" free_mem_min="5"
> pageSize="4096">).
>
> Our requirement is to apply a series of xsl on an xml in repository. The
> output of one xsl is the input xml for the next xsl. For this output is
> saved on the file system as a new xml file and then imported to eXist
> repository.
> We are using XQueryService to apply xsl on xml in repository.
> XQueryService returns a ResourceSet. We iterate through the ResourceSet
> and use getContent method to fetch the output into a string. This string
> is then written to a file in file system.
>
> Below is the snippet of our code:
>
> ResourceSet result = service.query(query);
> if (result != null) {
>     ResourceIterator resIterator = result.getIterator();
>     while (resIterator.hasMoreResources()) {
>         Resource r = resIterator.nextResource();
>         value = (String) r.getContent();
>         System.out.println("Content: " + value);
>     }
> }
>
> The code works fine for xmls of size 1-2 mb. But throws OutOfMemoryError
> - Java heap space - for files of size 5mb or more. The problem seems
> to be with getContent() method.
>
> Please provide a way/solution to this problem. Also please suggest if
> there is some more optimized way of applying xsl and writing the output
> to a file.
>
> Regards,
> Richa
>
> -----Original Message-----
> From: Wolfgang Meier [mailto:wol...@gm...]
> Sent: Monday, October 22, 2007 3:59 PM
> To: Richa Gupta ,Gurgaon
> Cc: Dannes Wessels; Exi...@li...
> Subject: Re: [Exist-open] problem while processing large xmls
>
>> Thanx for ur suggestion Dannes. But we have already configured eXist
>> to use Saxon.
>>
>> We applied xsl on small xml files (of size upto 1 mb). It worked fine.
>> But while applying the same xsl on large data (say 10 - 15 mb), we get
>> OutOfMemoryError.
>
> 768mb memory should be more than sufficient for an XSL transformation on
> a small 15mb document. Are you executing the query on an idle database
> or are there other queries running in parallel?
>
> Wolfgang
|