From: Paulo A. G. F. <pau...@pr...> - 2004-12-16 14:16:57
Ben,

Read my answers below.

Paulo Afonso Graner Fessel
UNIX Environment and Systems Administrator
pau...@pr...
OWT
Phone: +55 (11) 3038-6554
Fax: +55 (11) 3038-6508
http://www.primesys.com.br

> I echo Yves' comments about how to discard data and still get what you
> need. For example, if you take only 1 in 10 points, you may miss some
> vital spike in the graph.

Taking the mean of any data series will tend to smooth out spikes present
among the values. This holds for both nature-generated and
computer-generated data. ;)

> What is more important to you, to look good or be accurate?

Yves already pointed out that long-range time graphs don't need to be
that accurate. So I think they should look good AND take less time to
generate. This is also an issue today, as graphs for a long period take a
long time to appear.

> For instance, a week's data will be stored as a single value, or
> whatever time period you like. Each time period would be represented by
> storage of the average, max, min and standard deviation of that period.

This would be great. It would allow us to generate graphs faster, with a
nice look, AND provide some clues about spikes that may have disappeared
in the averaging process. We could even include a link for plotting the
full data whenever the stddev is above a specified value. ;)

> On the graph you would select either the full data, or a channel of
> summary data. Would this be an answer for you?

Oh yes, surely yes.

> I have been trying to work on a new database format which is smaller
> and faster for full data. But this is going slowly. Would you and other
> members of this list prefer if we got the summary data working first?

If it's possible to make it faster than the "new and improved database
scheme", yes. A big yes, really.

[]'s and thanks a lot for PerfParse.
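For the record, a minimal sketch of the summary idea discussed above: bucket
time-ordered samples into fixed-width periods and keep average, min, max and
standard deviation per period. The function name `summarize`, its signature,
and the dict layout are illustrative assumptions, not PerfParse code; the
stddev value is what would let a front end link to the full data for
"suspicious" periods.

```python
import statistics

def summarize(points, period):
    """Group (timestamp, value) samples into fixed-width periods.

    points: iterable of (timestamp_seconds, value) pairs
    period: bucket width in seconds
    Returns one summary dict per non-empty period, in time order.
    (Hypothetical helper, not part of PerfParse.)
    """
    buckets = {}
    for ts, value in points:
        # Integer-divide the timestamp to find which period it falls in.
        buckets.setdefault(ts // period, []).append(value)

    summary = []
    for start in sorted(buckets):
        vals = buckets[start]
        summary.append({
            "start": start * period,
            "avg": statistics.fmean(vals),
            "min": min(vals),
            "max": max(vals),
            # A high stddev hints that averaging hid a spike in this period.
            "stddev": statistics.pstdev(vals),
        })
    return summary
```

A graphing layer could then plot the `avg` series and, wherever `stddev`
exceeds a chosen threshold, offer a link to re-plot that period from the
full-resolution data.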