Hi,
I tried to connect to a 280 MB Access database. Java throws the following error: java.lang.OutOfMemoryError: GC overhead limit exceeded
I know this error is thrown when the JVM is spending too much time on garbage collection. Nevertheless, I don't want to turn off this safety behavior, because the application becomes very slow once the error is thrown.
Greetings
The problem wouldn't be limited to ucanaccess; more generally, it's due to your application. See here: http://stackoverflow.com/questions/1393486/error-java-lang-outofmemoryerror-gc-overhead-limit-exceeded
"The rare cases where I've seen this happen is where some code was creating tons of temporary objects and tons of weakly-referenced objects in an already very memory-constrained environment."
So, using a higher -Xmx value, the memory=false connection parameter, or -XX:-UseGCOverheadLimit (though that may cause the JVM process to consume an excessive amount of CPU) may help to solve it.
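For what it's worth, here is a minimal sketch of those suggestions. The database path, -Xmx value and class name are just placeholders; memory=false is the only UCanAccess-specific part, and the JVM flags are passed on the command line, not in the URL.

```java
// Launch with something like:
//   java -Xmx1024m -XX:-UseGCOverheadLimit -cp <ucanaccess-and-dependencies> MemoryFalseExample
import java.sql.Connection;
import java.sql.DriverManager;

public class MemoryFalseExample {
    public static void main(String[] args) throws Exception {
        // memory=false asks UCanAccess to keep its backing HSQLDB mirror on disk
        // instead of fully in memory, which lowers heap pressure for large files.
        String url = "jdbc:ucanaccess://C:/data/big.accdb;memory=false";
        try (Connection conn = DriverManager.getConnection(url)) {
            // ... run queries as usual
        }
    }
}
```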
Thanks for your reply!
I also read that Stack Overflow article. Are you sure that the problem is due to our application? Using the JDBC/ODBC Bridge never led to this error.
Setting singleConnection=true seemed to help solve this issue.
The JDBC/ODBC Bridge doesn't use Java in-memory technologies.
I meant it depends on what you're doing.
If you're opening multiple db files, the singleConnection=true connection parameter lets the resources (i.e. memory) be recovered when you close the connection. With more free memory, GC calls are less frequent and more effective. If you're opening a single db, the singleConnection=true parameter is irrelevant from this viewpoint: you need a higher -Xmx value or the memory=false parameter.
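As a rough illustration of the multi-file case (the file names are hypothetical; only singleConnection=true comes from the discussion above):

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class SingleConnectionExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical file names, just to illustrate opening several databases in turn.
        String[] dbFiles = { "C:/data/first.accdb", "C:/data/second.accdb" };
        for (String file : dbFiles) {
            // With singleConnection=true, closing this connection lets UCanAccess
            // release the memory held for this database before the next one is opened.
            String url = "jdbc:ucanaccess://" + file + ";singleConnection=true";
            try (Connection conn = DriverManager.getConnection(url)) {
                // ... work with this database; the try block closes the connection
            }
        }
    }
}
```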