1.9.9 Freezes Eclipse!
Status: Beta
Brought to you by:
mballance
Hello there,
After spending a whole day trying to make it work, I went back to 1.9.8:
Using 1.9.9 with exactly the same .f file as I was using in 1.9.8 just freezes Eclipse and does not even start the SDBIndexBuildJob (stuck at 0%) -> result: files are not indexed, Eclipse keeps freezing, and my machine gets really slow!
Going back to 1.9.8 resolves my issue (nothing modified in the .f file).
NB: using a smaller *.f file (with far fewer files) does not cause any problem in 1.9.9
Thanks
Thanks for the report. Can I ask a couple of questions:
Does the same happen in a new workspace?
Are you able to narrow it down to a specific file(s)?
Yes, thank you for the report! Another question: does Eclipse seem to be
spinning (i.e. taking 99% CPU), or deadlocked (i.e. taking 0% CPU)? Is the GUI
completely locked up, or can you still open menus?
Thanks!
On Thu, Apr 20, 2017 at 8:10 AM, StevenAZ stevenaz@users.sf.net wrote:
Actually I did not check what happened exactly, but most probably Eclipse keeps spinning (I cannot say how much CPU was used because we're working on a 32-core machine, and I guess there is a limit for every user/process).
However, here is another symptom: I am using a VNC server to connect to the machine. When the problem occurs, the file /home/username/.vnc/<machinename>.log keeps growing strangely, reaching up to 250GB!! and blocking every user on the machine.
I killed all Eclipse processes and removed the file, but it still appears under the name .nfs00000... .
To stop this "apocalypse" I had to kill my VNC server to stop all my running processes.
> Does the same happen in a new workspace?
Yes, same thing with a new workspace
> Are you able to narrow it down to a specific file(s)?
Did not try this; I did not want to run into the error again, because I blocked all the people sharing the disk with me.
Same issue pops up in 2.0.0.1: you can see the following log:
PS: If I remember correctly, "unique" was introduced in the code in 1.9.8
Nice find! I checked and the SVDBConstraintUniqueStmt class specifies its
object type as 'Foreach'. I've corrected this for the 2.0.1 release, and
will make a 2.0.0.2 release momentarily.
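For anyone curious what such a mislabeled type constant looks like in practice, here is a minimal sketch (hypothetical names, not SVEditor's actual code) of how a class that reports the wrong type tag leads to a bad cast during dispatch:

```java
// Hypothetical sketch of a type-tag bug: a statement class declares the
// wrong type constant, so dispatch code casts it to the wrong class.
enum StmtType { Foreach, ConstraintUniqueStmt }

class Stmt {
    final StmtType type;
    Stmt(StmtType type) { this.type = type; }
}

class ForeachStmt extends Stmt {
    ForeachStmt() { super(StmtType.Foreach); }
}

// Bug: this class mistakenly tags itself as Foreach instead of
// ConstraintUniqueStmt, mirroring the issue described above.
class UniqueConstraintStmt extends Stmt {
    UniqueConstraintStmt() { super(StmtType.Foreach); }
}

public class TypeTagDemo {
    // Dispatch on the declared tag; a wrong tag sends the object down the
    // wrong branch and the cast fails at runtime.
    static String visit(Stmt stmt) {
        switch (stmt.type) {
            case Foreach:
                ForeachStmt f = (ForeachStmt) stmt; // ClassCastException for the buggy class
                return "foreach";
            case ConstraintUniqueStmt:
                return "unique";
            default:
                return "unknown";
        }
    }

    public static void main(String[] args) {
        try {
            visit(new UniqueConstraintStmt());
            System.out.println("dispatched ok");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException from the wrong type tag");
        }
    }
}
```

A bug like this stays invisible until a file actually contains the affected construct, which fits the symptom of large file lists failing where small ones pass.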
On Thu, Jun 29, 2017 at 9:37 AM, Safouan Ben Jha sbenjha@users.sf.net
wrote:
Thank you for the quick reply and fix.
However, I am still wondering if the "[Error] Head is not cached" message is also causing the failure. I also tried adding files to the argument file step by step, and the error (the SVBuilderIndexJob blocking at 0%) happens randomly:
- if I start with a small file list, add files step by step, and rebuild the index each time, it works fine until I add around 950 files, and then it fails
- if I kill and restart Eclipse, go back to 900 files, and click "rebuild indexing argument files", it fails immediately (even though it had just passed in the previous attempt)
Do you think this can be related to the "unique" bug mentioned above?
I have updated to 2.0.0.2 and SVE works like a charm again :), many thanks.
You can set this topic to closed.
I hope that this resolves the issue, but I am suspicious ;-). If you encounter the 'hang' again, try following these instructions to detect thread deadlock using the 'jconsole' tool:
https://docs.pushtechnology.com/docs/5.7.6/manual/html/administratorguide/systemmanagement/jconsole_deadlocks.html
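For reference, the same deadlock check that jconsole performs can also be scripted directly against the JVM's ThreadMXBean. A minimal, self-contained sketch (illustrative code, not part of SVEditor) that manufactures a deadlock and then detects it:

```java
import java.lang.management.ManagementFactory;

public class DeadlockDemo {
    // Spawn two threads that acquire locks in opposite order, then ask the
    // JVM whether it sees a deadlock -- the same check jconsole performs.
    static boolean detectDeadlock() {
        final Object lockA = new Object();
        final Object lockB = new Object();

        Thread t1 = new Thread(() -> {
            synchronized (lockA) { pause(200); synchronized (lockB) { } }
        });
        Thread t2 = new Thread(() -> {
            synchronized (lockB) { pause(200); synchronized (lockA) { } }
        });
        t1.setDaemon(true);  // let the JVM exit despite the deadlocked threads
        t2.setDaemon(true);
        t1.start();
        t2.start();

        pause(1000);  // give the threads time to deadlock
        return ManagementFactory.getThreadMXBean().findDeadlockedThreads() != null;
    }

    static void pause(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException e) { }
    }

    public static void main(String[] args) {
        System.out.println(detectDeadlock() ? "deadlock detected" : "no deadlock");
    }
}
```

If findDeadlockedThreads() returns null while Eclipse is burning CPU, the process is spinning rather than deadlocked, which is itself useful information.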
Hopefully the fix in 2.0.0.2 has resolved this issue. If not, please reopen the bug.
Well Matthew, you were right when you said you were suspicious. :-/
In fact the issue popped up again in 2.0.1, with exactly the same project and the same files... that was working with 2.0.0.2
This time I can deliver more details about that:
The error message is:
[ERROR] Head is not cached
and it is printed thousands of times (until I kill Eclipse) -> infinite loop. I found that the message is printed when this piece of code is executed (line 522).
Through jconsole, I get this output for the working thread:
I hope this helps identify the issue.
PS: can you please set this issue status again to "open"
Thanks
Safouan
Last edit: Safouan Ben Jha 2017-07-11
Thanks for this additional information. The message "Head is not cached" does provide some clues, but it's also a bit of a head-scratcher ;-)
If you haven't already, please try creating your large project again using a fresh workspace. The error message involves the database cache that is stored inside the workspace, so I'm curious whether the database is somehow becoming corrupted. Reproducing this symptom starting from a brand-new workspace would at least provide another clue to what's going on...
Well, I tried that now and got the same error message, and also found this cast failure again:
So I opened the repo and checked your commit fixing the cast issue, then opened the same file on the master branch, and found that the object type is correct (ConstraintUniqueStmt).
So I went further by decompiling the jar file of release 2.0.1 and... tadaaa, I found that the error that was supposed to be fixed is still present in the current release:
You can check the screenshot below.
Cheers,
Safouan
Last edit: Safouan Ben Jha 2017-07-12
Another request please: maybe it would be better to prevent the "[ERROR] Head is not cached" message from looping infinitely.
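One simple way to guard against that kind of runaway logging (just a sketch with made-up names, not SVEditor's actual logging code) is to cap how many times an identical message is emitted:

```java
import java.util.HashMap;
import java.util.Map;

public class ThrottledLog {
    static final int MAX_REPEATS = 10;
    static final Map<String, Integer> counts = new HashMap<>();

    // Print the message only until it has repeated MAX_REPEATS times;
    // returns true if the message was actually emitted.
    static boolean error(String msg) {
        int n = counts.merge(msg, 1, Integer::sum);
        if (n > MAX_REPEATS) {
            return false; // suppress further repeats of the same message
        }
        System.err.println("[ERROR] " + msg);
        return true;
    }
}
```

A cap like this would at least keep a runaway error path from flooding logs (and, over VNC, terminal output) with gigabytes of identical lines.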
I would also like to be able to downgrade to the previous release (very much needed in my case now, since the bug is critical), because at the moment it is not possible to do so.
Many thanks,
Safouan
Okay, I've made a few changes and made a 2.0.2 release. Please try this and let me know what you find.
I'm very confused by what you see when you decompile the 2.0.1 .jar file. As you note, the source files in GIT show the correct source code for the SVDBUniqueConstraintStmt class, as do the source files in my work area. I tested the 2.0.2 release downloaded from the update site, and it appears to behave correctly when unique constraints are specified.
Again, let me know what you find, and thanks in advance for your help!
Best Regards,
Matthew
Thank you very much for the fix. Now everything is back to normal; all works as expected in 2.0.2.
I think you can close this topic again (for real this time, hopefully).
Best Regards,
Safouan
That's excellent, Safouan, thanks for confirming!