From: Oti <oh...@ya...> - 2002-11-11 22:59:47
[ Robert Oschler ]
> I tried to compile a python file using the following options:
>
>     jythonc --deep --jar new.jar new.py
>
> After cranking away for several minutes, it terminated with an "out
> of memory" message, and suggested increasing the heap size with the
> "-mx" option. I then tried:
>
>     jythonc -J-mx1000000 --deep --jar new.jar new.py
>
> and got the same error. I then tried 50M instead of 1M for the heap
> size and still got an "out of memory" error. new.py pulls in the
> Jython XML package structure, and the memory error always occurs
> right after "processing codecs". Anybody know what I might be able
> to do about this?

Not exactly, but I also suffer from this problem: only in very rare
cases, and definitely not jythonc-related, but in an embedded
interpreter. Up to now I tend to think that it is my own code that is
leaking, but you never know...

You could try setting python.options.showJavaExceptions=true in the
registry, and maybe get the stack trace of the exception caught by
Jython. My feeling is that there is another restriction in the JVM,
since in my case the size of the Java process is far below the -mx
upper bound.

Probably not very helpful,
Oti.
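[Editor's note, not part of the original thread: one possible explanation for the failed `-J-mx1000000` attempt is that classic JVMs read a bare number after `-mx`/`-Xmx` as *bytes*, so `1000000` requests just under 1 MB, which is likely smaller than the default heap rather than larger. A suffixed value such as `-J-mx256m` requests 256 MB. The sketch below uses a hypothetical helper, `parse_heap_size`, to illustrate the documented suffix rules.]

```python
def parse_heap_size(value):
    """Parse a JVM-style heap size ('1000000', '50m', '1g') into bytes.

    Hypothetical helper for illustration: mirrors the JVM rule that a
    bare number is bytes, while 'k'/'m'/'g' suffixes scale by powers
    of 1024.
    """
    value = value.strip().lower()
    multipliers = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    if value and value[-1] in multipliers:
        return int(value[:-1]) * multipliers[value[-1]]
    return int(value)  # no suffix: plain bytes

print(parse_heap_size("1000000"))  # 1000000 bytes -- less than 1 MB
print(parse_heap_size("50m"))      # 52428800 bytes -- 50 MB
```

So `-J-mx50m` (or larger, e.g. `-J-mx256m`) would be the usual way to actually raise the heap for a `--deep` compile that pulls in large package structures.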