From: Ethan A M. <merritt@u.washington.edu> - 2006-02-22 16:20:30
I don't have time to do real profiling at the moment, but a quick test
under linux handles a 1280x1024 image in 2.36 seconds using the current
code, and 2.40 seconds using your modified code. I.e. no significant
difference.

  echo "plot '< convert xxx.jpeg avs:-' binary filetype=avs with rgbimage" > bench.gnu

  time gnuplot bench.gnu
  2.118u 0.260s 0:02.36 100.4%	0+0k 0+0io 0pf+0w

  time gnuplot_new bench.gnu
  2.129u 0.257s 0:02.40 98.7%	0+0k 0+0io 0pf+0w

But I agree that the current arbitrary limit of 1000 additional points
is foolish.

On Wednesday 22 February 2006 06:37 am, Bastian Maerkisch wrote:
> When drawing large images (binary and ascii) gnuplot is very
> slow on my (windows) machine. E.g. with an RGB image of 1024*256
> pixels it takes about >5.5s to output on any terminal.
>
> Profiling reveals that gnuplot spends >75% of its time in
> gp_realloc() <-- cp_extend() <-- get_data().
>
> The crucial point is that the maximum increase of the points[]
> array is limited to 1000. Changing line 411 in plot2d.c
>
> -    cp_extend(current_plot, i + (i < 1000 ? i : 1000));
> +    cp_extend(current_plot, i + i);
>
> eliminates that problem, but uses exceedingly large amounts
> of memory.
>
> Can anybody confirm this behaviour on non-windows machines?
> Or is this a problem of gp_realloc() on Windows?
>
> Bastian

--
Ethan A Merritt               Biomolecular Structure Center
University of Washington, Seattle 98195-7742