From: Rick R. <rl...@fi...> - 2002-07-02 11:54:56
Well, I just tried this with a Repast model, set up to run in batch mode (no GUI comes up) as (I think) I outlined in a message a while back. (It's a model we can run with Drone, at least.) And... nohup didn't stop it from being killed when I killed the ssh job! The second I killed the ssh, the Repast run was stopped.

I just tried it again with a Swarm model, and that worked fine, i.e., I can kill the ssh and the Swarm batch job goes on. So there does seem to be something else going on with the Repast model I used. I guess I'll do some more exploring...

- r

--
Rick Riolo rl...@um...
Center for Study of Complex Systems (CSCS)
4477 Randall Lab
University of Michigan
Ann Arbor MI 48109-1120
Phone: 734 763 3323  Fax: 734 763 9267
http://www.pscs.umich.edu/PEOPLE/rlr-home.html

On Tue, 2 Jul 2002, Rick Riolo wrote:

> Date: Tue, 2 Jul 2002 07:20:43 -0400 (EDT)
> From: Rick Riolo <rl...@fi...>
> Reply-To: Rick Riolo <rl...@um...>
> To: Jason Woodard <jwo...@hb...>
> Cc: rep...@li...
> Subject: Re: [Repast-interest] detached batch operation
>
> I think the basic problem is in ssh (introduced a release
> or two ago), and I believe you can get around this with nohup.
> I know we had problems like this with Swarm models,
> tcl/tk scripts, Perl scripts, emacs sessions, etc!
>
> I've not done this with a Repast model, but with other
> processes (e.g. Swarm models) I can ssh to the remote machine,
> then
>     nohup programToRun &
> then when I exit ssh, it will not stop that remote job,
> but (at least for me) the ssh will "wait".
> But I can then close the xterm I started
> the ssh in, or I can even kill that ssh job,
> and the remote job will go on.
>
> Let us know how it works for you with Repast.
> Thanks,
> - r
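The detach recipe described above can be checked in isolation, without a real model. This is a minimal sketch, not the actual Repast invocation: the `sleep`/`echo` pair stands in for a long-running batch run, and the inner `sh -c '...'` stands in for the ssh session that gets killed. It shows a nohup'd background job surviving the exit of the shell that launched it:

```shell
#!/bin/sh
# Start a stand-in "model run" under nohup from a child shell, then let
# that child shell exit immediately -- mimicking the ssh session dying.
sh -c 'nohup sh -c "sleep 1; echo finished > run_result.txt" >/dev/null 2>&1 &'

# Give the detached job time to complete after its parent shell is gone.
sleep 2
cat run_result.txt   # prints "finished": the job outlived its parent
```

If the file appears with the expected contents, the detached job kept running after its launching shell exited, which is the behavior Rick reports for Swarm models but not for the Repast run.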