On Thu, 28 Oct 2004, Daniel J Sebald wrote:
> So reducing that data up front might help. You can't just discard data
> points for fear of aliasing, perhaps something like averaging of the
> samples, i.e., reduce 2000 points to 200 points by averaging groups of
> ten. (I'm talking uniform sampling now.) Just a thought.
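The averaging idea quoted above can be sketched in a few lines of Python; this is a hedged illustration of block-mean decimation, not anything Pawel actually posted, and the sample data is made up:

```python
def decimate_by_mean(samples, group=10):
    """Average consecutive non-overlapping groups of `group` samples.

    Reduces 2000 uniformly sampled points to 200 when group=10.
    A partial group at the end is dropped for simplicity.
    """
    n = len(samples) // group * group
    return [sum(samples[i:i + group]) / group for i in range(0, n, group)]

data = list(range(2000))          # stand-in for 2000 uniform samples
reduced = decimate_by_mean(data)  # 200 averaged points
```

Because each output point is a local mean rather than a raw decimated sample, this acts as a crude low-pass filter and so limits the aliasing that plain point-dropping would cause.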
In fact, something like the 'smooth uniq' filter might already help with
that, if applied correctly. Unfortunately, though, the modification of
the data needed to make 'smooth uniq' do something useful in a case
like this is not possible for time data: Pawel would have to
plot 'data' using (int($1/5)*5):2
but time data don't allow extended 'using' specs.
Ultimately, though, the key point is that you need a circular
buffering strategy, and that has to be done outside gnuplot. Instead
of always plotting the *entire* file, a true real-time plot should have a
sliding window that plots only the last N seconds' (or whatever the x axis
unit is) worth of data, or the data since the full minute at least 10
minutes ago, or something like that. But again, this has to happen outside
gnuplot: there's no 'every ::-500' or similar syntax to mean "select only
the last 500 datapoints".
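A minimal sketch of such an external sliding-window buffer, e.g. in Python: keep only the last N datapoints in memory and rewrite the file gnuplot re-plots from. The window size, file name, and sample format here are illustrative assumptions, not from the thread:

```python
from collections import deque

N = 500
window = deque(maxlen=N)   # oldest points fall off automatically

def append_sample(line):
    """Add one new "time value" line; the buffer never exceeds N lines."""
    window.append(line)

# Simulate 1200 incoming samples; only the last 500 survive.
for i in range(1200):
    append_sample(f"{i} {i * i}")

# After each batch of new samples, the driving script would dump the
# window and tell gnuplot to replot, e.g.:
# with open('window.dat', 'w') as f:
#     f.write('\n'.join(window))
```

`deque(maxlen=N)` does the circular buffering for you: appending to a full deque silently discards the oldest entry, which is exactly the "plot only the last N points" behaviour gnuplot itself cannot express.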
Hans-Bernhard Broeker (broeker@...)
Even if all the snow were burnt, ashes would remain.