I'm mining information from a Sybase IQ database (ASA). Some of these queries produce large result sets.
I've noticed in some cases that SQuirreL's memory usage bar hits 254 MB out of 254 MB, and it seems unable to allocate any more memory.
At this point it usually becomes unresponsive and just hangs, and I have to kill the client. The client should therefore either be able to allocate more memory (perhaps via a configuration setting) or handle the out-of-memory condition more gracefully.
When it's close to memory exhaustion, the status is usually "building results". Trying to cancel this doesn't always work, and it still runs out of memory.
I'm running on Windows XP SP2.
Logged In: YES
user_id=1287991
Originator: NO
In squirrel-sql.bat the last line is "start "SQuirreL SQL Client" /B "%LOCAL_JAVA%w" -Xmx256m..."
To handle larger data sets, you can increase the maximum memory to, say, 512 MB (or 1024 MB if you have 1 GB of RAM) by changing
-Xmx256m
to
-Xmx512m
or
-Xmx1024m
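For example, the edited launch line in squirrel-sql.bat might look like this (the trailing arguments, elided in the original, are elided here too):

```bat
rem Launch SQuirreL with a 512 MB heap ceiling instead of the default 256 MB.
rem Only the -Xmx value changes; leave the rest of the line as it is.
start "SQuirreL SQL Client" /B "%LOCAL_JAVA%w" -Xmx512m ...
```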
Rob
Logged In: NO
I have seen the same with Oracle on Windows XP SP2.
What is strange is that it doesn't seem to free the memory. It always goes up and up until the limit of 254 MB (I can reach it in fewer than 10 queries). Could it be a memory leak?
Logged In: YES
user_id=1287991
Originator: NO
"What is strange is that it doesn't seem to free the memory. It always goes
up and up until the limit of 254 MB"
That is expected behavior. The JVM allocates more memory as required: if there is free space on the already-allocated heap, it uses that; if there is no more free space for new objects, it grows the heap (up to the -Xmx limit). When objects are garbage-collected, their memory is reclaimed for reuse within the heap, but the heap itself does not shrink back. This is intended JVM behavior.
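A quick way to observe this from Java (a minimal sketch, not part of SQuirreL itself; run it with whatever -Xmx value you set to see the ceiling) is to query the Runtime API:

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // maxMemory(): the ceiling the JVM may grow the heap to (set by -Xmx)
        long maxMb = rt.maxMemory() / mb;
        // totalMemory(): heap currently allocated from the OS -- this is the
        // number the memory bar tracks; it grows but rarely shrinks
        long totalMb = rt.totalMemory() / mb;
        // freeMemory(): unused space within the currently allocated heap
        long freeMb = rt.freeMemory() / mb;
        System.out.println("max=" + maxMb + " MB, allocated=" + totalMb
                + " MB, free within allocated=" + freeMb + " MB");
    }
}
```

The memory bar hovering at the maximum is therefore not a leak by itself; it only indicates the heap has grown to its -Xmx ceiling.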
Rob