From: Christofer D. <mai...@c-...> - 2007-10-23 13:52:02
Sounds like the standard Cocoon task. Have you thought of letting Cocoon do the
transformations, perhaps writing your own serializer? That way memory
consumption should be almost linear (ignoring the actual query's memory usage).
Another suggestion would be to profile your XSLTs; I use oXygen for this. It
helps to find the bottlenecks and to reduce memory consumption by optimizing
the transformation itself.

Regards,
Chris

Richa Gupta, Gurgaon wrote:
> Hi,
>
> We made changes in startup.sh (-Xms256000k -Xmx768000k) and
> conf.xml (<db-connection cacheSize="48M" collectionCacheSize="128"
> database="native" files="webapp/WEB-INF/data" free_mem_min="5"
> pageSize="4096">).
>
> Our requirement is to apply a series of XSLs to an XML document in the
> repository. The output of one XSL is the input XML for the next. For this,
> the output is saved on the file system as a new XML file and then imported
> back into the eXist repository.
> We use XQueryService to apply an XSL to an XML document in the repository.
> XQueryService returns a ResourceSet. We iterate through the ResourceSet and
> use the getContent() method to fetch the output into a String, which is then
> written to a file on the file system.
>
> Below is a snippet of our code:
>
>     ResourceSet result = service.query(query);
>     if (result != null) {
>         ResourceIterator resIterator = result.getIterator();
>         while (resIterator.hasMoreResources()) {
>             Resource r = resIterator.nextResource();
>             value = (String) r.getContent();
>             System.out.println("Content: " + value);
>         }
>     }
>
> The code works fine for XML files of 1-2 MB, but throws an OutOfMemoryError
> (Java heap space) for files of 5 MB or more. The problem seems to be with
> the getContent() method.
>
> Please suggest a solution to this problem. Also, please suggest whether
> there is a more optimized way of applying the XSL and writing the output
> to a file.
>
> Regards,
> Richa
>
> -----Original Message-----
> From: Wolfgang Meier [mailto:wol...@gm...]
> Sent: Monday, October 22, 2007 3:59 PM
> To: Richa Gupta, Gurgaon
> Cc: Dannes Wessels; Exi...@li...
> Subject: Re: [Exist-open] problem while processing large xmls
>
>> Thanks for your suggestion, Dannes, but we have already configured eXist
>> to use Saxon.
>>
>> We applied the XSL to small XML files (up to 1 MB in size) and it worked
>> fine. But when applying the same XSL to larger data (say 10-15 MB), we
>> get an OutOfMemoryError.
>
> 768 MB of memory should be more than sufficient for an XSL transformation
> on a small 15 MB document. Are you executing the query on an idle database,
> or are there other queries running in parallel?
>
> Wolfgang
> _______________________________________________
> Exist-open mailing list
> Exi...@li...
> https://lists.sourceforge.net/lists/listinfo/exist-open
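Since the OutOfMemoryError points at getContent() building the whole result as
one Java String, a way around it is to stream the result to the output instead.
The XML:DB API's XMLResource.getContentAsSAX(ContentHandler) can push SAX
events straight into a JAXP identity serializer writing to a file, so the
document is never materialized in the heap. A minimal sketch (the class and
method names are mine, and the demo drives the serializer from a plain parse
as a stand-in for eXist; with the remote driver, content may still be buffered
on the client before parsing):

```java
import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.io.StringReader;

import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXResult;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class SaxSerializerDemo {

    // Builds a SAX-event serializer that writes straight to an OutputStream.
    // In the code from the thread, the same handler could be passed to
    // ((XMLResource) r).getContentAsSAX(handler) instead of calling
    // r.getContent(), so no per-document String is ever built.
    public static TransformerHandler newSerializer(OutputStream out) throws Exception {
        SAXTransformerFactory stf =
                (SAXTransformerFactory) SAXTransformerFactory.newInstance();
        TransformerHandler h = stf.newTransformerHandler(); // identity transform
        h.getTransformer().setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        h.setResult(new StreamResult(out));
        return h;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for getContentAsSAX(): feed SAX events from a parsed stream.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        TransformerHandler h = newSerializer(out);
        Transformer id = TransformerFactory.newInstance().newTransformer();
        id.transform(new StreamSource(new StringReader("<a><b>x</b></a>")),
                     new SAXResult(h));
        System.out.println(out.toString("UTF-8")); // <a><b>x</b></a>
    }
}
```

In production the ByteArrayOutputStream would be a BufferedOutputStream over a
FileOutputStream, one per result resource.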
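The "output of one XSL is the input for the next" step also need not go through
intermediate files at all: JAXP's SAXTransformerFactory lets you chain
TransformerHandlers so each stage's SAX output feeds the next stage directly,
and only the final stage is serialized. A sketch under the assumption that the
stylesheets are plain XSLT 1.0/2.0 loadable as StreamSources (the class name
and the tiny inline stylesheets in main() are made up for illustration):

```java
import java.io.StringReader;
import java.io.StringWriter;

import javax.xml.transform.Transformer;
import javax.xml.transform.sax.SAXResult;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XslChain {

    // Applies a sequence of stylesheets as a single SAX pipeline: stage N's
    // output events feed stage N+1 directly, so no intermediate document is
    // ever built in memory or written to disk.
    public static void transform(StreamSource input, StreamSource[] stylesheets,
                                 StreamResult output) throws Exception {
        SAXTransformerFactory stf =
                (SAXTransformerFactory) SAXTransformerFactory.newInstance();

        TransformerHandler first = stf.newTransformerHandler(stylesheets[0]);
        TransformerHandler prev = first;
        for (int i = 1; i < stylesheets.length; i++) {
            TransformerHandler next = stf.newTransformerHandler(stylesheets[i]);
            prev.setResult(new SAXResult(next)); // wire stage i-1 into stage i
            prev = next;
        }
        prev.setResult(output); // only the last stage is serialized

        // Drive the pipeline with an identity transformer reading the input.
        Transformer id = stf.newTransformer();
        id.transform(input, new SAXResult(first));
    }

    public static void main(String[] args) throws Exception {
        String xsl1 = "<xsl:stylesheet version='1.0'"
                + " xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
                + "<xsl:template match='/root'><step1>"
                + "<xsl:value-of select='.'/></step1></xsl:template>"
                + "</xsl:stylesheet>";
        String xsl2 = "<xsl:stylesheet version='1.0'"
                + " xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
                + "<xsl:output omit-xml-declaration='yes'/>"
                + "<xsl:template match='/step1'><done>"
                + "<xsl:value-of select='.'/></done></xsl:template>"
                + "</xsl:stylesheet>";
        StringWriter out = new StringWriter();
        transform(new StreamSource(new StringReader("<root>hi</root>")),
                  new StreamSource[] {
                      new StreamSource(new StringReader(xsl1)),
                      new StreamSource(new StringReader(xsl2)) },
                  new StreamResult(out));
        System.out.println(out); // <done>hi</done>
    }
}
```

Note that XSLT itself may still buffer the whole input per stage (most
processors build a tree for matching), so this saves the file round-trips and
the String copies, not the transformer's own working memory.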