From: Daniel J S. <dan...@ie...> - 2004-05-10 06:15:24
Ethan A Merritt wrote:
> On Sunday 09 May 2004 11:12 pm, Daniel J Sebald wrote:
>> Ethan A Merritt wrote:
>>> I tentatively propose to modify the time delay to be
>>>
>>>     #ifdef HAVE_USLEEP
>>>         usleep(100);
>>>     #else
>>>         sleep(1);
>>>     #endif
>>
>> Not completely comfortable with that for the simple fact that it is
>> increasing the likelihood of a discarded character.
>
> No, I think I haven't explained it well enough. You will not lose
> a character until waitforinput() gives up and leaves the loop
> entirely. It is still possible to loop as many times as you like
> before giving up. Total wait time = time per loop X # of iterations.
> If you still have no response from the X server after several
> seconds, I think losing a character is the smaller part of your
> problem at that point.
>
> If things are functioning properly, then a loop like the one in
> animate.dem will only go through the waitforinput() loop once
> per replot even if there is unexpected terminal input. So making
> the per-loop delay shorter should take care of the problem.

Oh, I see. The delay is simply to prevent the consumption of all the
CPU cycles, not to solve the problem of the character being discarded.
That problem was solved with the original wait protocol. Yeah, the
#ifdef HAVE_USLEEP should do.

Dan