#432 backup scripts before executing sql

Status: open
Group: General (118)
Priority: 5
Updated: 2011-08-14
Created: 2011-08-04
Creator: Justin Pitts
Private: No

I use SQuirreL largely because (1) it is JDBC-based and (2) it remembers the history of what I've run.

I deal in large datasets, and I prefer to use the tool in "running with scissors" mode - that is, with no result set limit. I know the danger; I've upped -Xmx. It still bites me.

The part that annoys me most is that, when this bites me, I lose the history of that last thing I ran.

I would like an OutOfMemoryError not to cause SQuirreL to miss adding that last statement to my history.

Potential approaches?
1. Save before executing.
2. Don't let OOM happen.

#1 sounds easy(er), and IMHO ought to be done in either case. I'm game to try to write that patch.
#2 sounds harder, but maybe not impossible. A heuristic that checks the available heap before iterating over the result set, and again every N rows, could halt the fetch once a threshold of that initially available heap has been consumed (see the sketch below). This sounds very much like an opt-in feature to me, but I think it would eliminate many (most?) of my crashes.
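
A minimal sketch of that heap-check heuristic against a plain JDBC ResultSet, for illustration only: the class name, the 80% budget, and the 1000-row check interval are made up here, and Runtime's free-memory figures are estimates that shift with GC activity, which is why this can only ever be a heuristic.

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class GuardedFetch
{
   // Stop fetching once this fraction of the initially available heap is used up.
   private static final double HEAP_BUDGET = 0.8;

   // Re-check the heap only every N rows to keep the overhead negligible.
   private static final int CHECK_INTERVAL = 1000;

   public static List<String[]> fetchGuarded(Statement stmt, String sql) throws SQLException
   {
      Runtime rt = Runtime.getRuntime();

      // Heap still obtainable: free memory plus what the VM may yet allocate.
      long initialAvailable = rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());

      List<String[]> rows = new ArrayList<>();
      try (ResultSet rs = stmt.executeQuery(sql))
      {
         int columnCount = rs.getMetaData().getColumnCount();
         int rowCount = 0;
         while (rs.next())
         {
            String[] row = new String[columnCount];
            for (int i = 0; i < columnCount; i++)
            {
               row[i] = rs.getString(i + 1);
            }
            rows.add(row);

            if (++rowCount % CHECK_INTERVAL == 0)
            {
               long available = rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());
               if (initialAvailable - available > initialAvailable * HEAP_BUDGET)
               {
                  // Halt the fetch before the heap runs out;
                  // the rows read so far are kept.
                  break;
               }
            }
         }
      }
      return rows;
   }
}
```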

Discussion

  • assigned_to: manningr --> wis775
     
  • As of the current snapshot, I have fixed a bug where the SQLHistory doesn't honor the following global setting: General => Save preferences immediately

    If this setting is enabled, the SQLHistory will be saved immediately after the execution of a statement.

    Please give the next snapshot release (later than Snapshot-20110808_1747) a try and see if this solves your primary problem.
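
    A minimal sketch of what "saved immediately" can mean in practice - an append-per-statement history file. The class below is illustrative only; it is not SQuirreL's actual SQLHistory implementation:

    ```java
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;

    public class ImmediateSqlHistory
    {
       private final Path _historyFile;

       public ImmediateSqlHistory(Path historyFile)
       {
          _historyFile = historyFile;
       }

       // Called right after a statement runs. Because the entry is appended and
       // flushed at once, it survives even if the very next statement kills the VM.
       public void add(String sql) throws IOException
       {
          Files.write(_historyFile,
                (sql + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
       }
    }
    ```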

    On the other hand, an OutOfMemoryError will be hard to prevent, because it may occur not only while reading the result set; it may also occur within Swing dispatcher threads while updating the GUI, and in other situations. So I will have a look at whether we can free some memory when an OutOfMemoryError occurs - perhaps by closing all result tabs in the active session or in all sessions. That should be enough for the user to save his work (the content of the SQL panels) and restart SQuirreL under known conditions. But this is still under investigation.

    Stefan

     
  • We have implemented the described improvement by handling the OutOfMemoryError. We can't prevent these errors, but we can try to free some memory by closing all result tabs of all sessions. This may free enough memory that the user can save his work and restart the application (which is strongly recommended).
    Please give the latest snapshot a try. We are interested in your feedback.
    Thanks
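
For illustration, a minimal sketch of the recovery path described in that comment. The Session interface and the warning dialog are stand-ins, not SQuirreL's real classes, and there is an inherent caveat: if the heap is completely exhausted, even the handler's own allocations can fail.

```java
import java.util.List;
import javax.swing.JOptionPane;

public class OutOfMemoryRecovery
{
   // Stand-in for a session that holds open result tabs;
   // the real application classes look different.
   public interface Session
   {
      void closeAllResultTabs();
   }

   public static void runGuarded(Runnable work, List<Session> allSessions)
   {
      try
      {
         work.run();
      }
      catch (OutOfMemoryError err)
      {
         // The cached result sets behind the tabs are usually the largest heap
         // consumers, so dropping them is the best chance to get memory back.
         for (Session session : allSessions)
         {
            session.closeAllResultTabs();
         }

         // With some heap recovered, warn the user to save and restart.
         // Caveat: if memory is still exhausted, even this dialog may fail.
         JOptionPane.showMessageDialog(null,
               "Out of memory: all result tabs were closed.\n"
                     + "Please save your work and restart the application.");
      }
   }
}
```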