From: Ramon Gonzalez-A. <ar...@ma...> - 2005-12-14 01:38:50
Dear Sirs,

A short first report on my tests with Foo. I have had the fellow synthesizing away for hours. I have done mostly one type of test, because it is a heavy one and I had lots of recent examples at hand (actually from the piece I did in June): processes with many, many syntheses using small contexts. The next step should be more tests with large contexts too. Still, I want to report two items.

First: the first time I ran one of these tests, I soon got a Segmentation Fault. After that first time, I have had it running for hours (I restarted the process three times), and it was always fine. Strange, though.

Second, a comment on memory. I have a Foo with a stack of 65536. When I start Foo, top shows Virtual: 136 MB, Res: 12 MB and Shared: 2744. After three hours of running (as I said, hundreds of small syntheses) I had Virtual: 1132 MB, Res: 693 MB and Shared: 2748. The gc reported 2403K of 65536K the whole time. So there is a constant increase in memory. Does this mean there is still a memory leak? At the end of each synthesis I kill the context; should I do something else?

These tests were done on the Linux machine.

One more thing. When I got the first Segmentation Fault, I tried to check with the old Foo, the one I had used in May/June with the hacks that Martin had done for me. It was impossible: it gave a Segmentation Fault the moment I started the synthesis. Of course, I have updated (or almost) the system in the meantime. Is this normal?

All the best,
Ramon
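
A minimal way to confirm the steady growth Ramon describes is to sample the process's memory from /proc at regular intervals rather than reading top by eye. The sketch below is generic Linux tooling, not part of Foo; the PID argument and the 60-second interval are placeholder assumptions.

#!/usr/bin/env python3
"""Log a process's virtual size and resident set over time.

Generic Linux sketch: reads /proc/<pid>/status. The PID and the
sampling interval are assumptions, not anything Foo provides.
"""
import sys
import time


def memory_kb(pid):
    """Return (VmSize, VmRSS) in kB for the given pid."""
    sizes = {}
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith(("VmSize:", "VmRSS:")):
                key, value = line.split(":", 1)
                sizes[key] = int(value.split()[0])  # value looks like "136000 kB"
    return sizes.get("VmSize", 0), sizes.get("VmRSS", 0)


def main():
    pid = int(sys.argv[1])   # e.g. the Foo process id shown by top
    interval = 60            # seconds between samples
    while True:
        vsize, rss = memory_kb(pid)
        print(f"{time.strftime('%H:%M:%S')}  VmSize={vsize} kB  VmRSS={rss} kB",
              flush=True)
        time.sleep(interval)


if __name__ == "__main__":
    main()

Run as "python3 memlog.py <pid>" during a long run of small syntheses; if VmRSS keeps climbing while the gc figure stays around 2403K of 65536K, the growth is happening outside the gc-managed heap, which is the pattern one would expect from a leak in unmanaged allocations.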