Since the purpose of creating the image pyramid is visualization, the best large-image tools use cubic or sinc interpolation to down-sample.
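
For concreteness, a separable Catmull-Rom cubic kernel is one way to do this; the sketch below down-samples a single scanline by a factor of two, with the kernel widened so it also acts as the anti-alias filter. The half-pixel centering and the factor-of-two decimation are just illustrative choices, not taken from any particular library.

  #include <cmath>
  #include <cstddef>
  #include <vector>

  // Catmull-Rom cubic kernel, support [-2, 2].
  static double cubic_weight(double x)
  {
    x = std::fabs(x);
    if (x < 1.0) return 1.5*x*x*x - 2.5*x*x + 1.0;
    if (x < 2.0) return -0.5*x*x*x + 2.5*x*x - 4.0*x + 2.0;
    return 0.0;
  }

  // Down-sample one scanline by two; a full pyramid level is produced
  // by applying this separably to the rows and then the columns.
  static std::vector<double> downsample_row(const std::vector<double>& in)
  {
    std::vector<double> out(in.size() / 2);
    for (std::size_t i = 0; i < out.size(); ++i) {
      double src = 2.0*i + 0.5;   // source coordinate of the output sample
      double sum = 0.0, wsum = 0.0;
      for (int k = -4; k <= 4; ++k) {  // kernel widened 2x for decimation
        long j = static_cast<long>(std::floor(src)) + k;
        if (j < 0 || j >= static_cast<long>(in.size())) continue;
        double w = cubic_weight((src - j) / 2.0);
        sum  += w * in[static_cast<std::size_t>(j)];
        wsum += w;
      }
      out[i] = (wsum > 0.0) ? sum / wsum : 0.0;  // renormalize at borders
    }
    return out;
  }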

 

Large images introduce a number of other issues.  Perhaps the most important in the near term is the ability to compute a histogram of the image intensities efficiently.  This should be done on a small fraction of the image pixels for speed, but at the same time the histogram should be globally valid, i.e. reflect the statistics of the entire image, not just a few contiguous blocks.  Also, enough pixels must be collected so that the histogram bin values are approximately the same from one random trial to the next.
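
One way to satisfy all three constraints at once is to accumulate the histogram from randomly placed blocks until a target fraction of the pixels has been seen.  A rough sketch, assuming 8-bit pixels; read_block here is a stand-in for whatever blocked access the image class actually provides:

  #include <cstddef>
  #include <functional>
  #include <random>
  #include <vector>

  // Accumulate a 256-bin histogram from randomly chosen blocks until
  // target_fraction of the ni x nj image has been sampled.  read_block
  // is a hypothetical accessor returning one block's pixels; the image
  // is assumed to be at least one block in each dimension.
  std::vector<unsigned long> sample_histogram(
      std::size_t ni, std::size_t nj, std::size_t block_size,
      double target_fraction,
      const std::function<std::vector<unsigned char>(std::size_t,
                                                     std::size_t)>& read_block)
  {
    std::vector<unsigned long> hist(256, 0);
    const std::size_t target = static_cast<std::size_t>(
        target_fraction * static_cast<double>(ni) * static_cast<double>(nj));
    std::size_t seen = 0;
    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<std::size_t> bi(0, ni / block_size - 1);
    std::uniform_int_distribution<std::size_t> bj(0, nj / block_size - 1);
    while (seen < target) {
      for (unsigned char p : read_block(bi(rng), bj(rng)))  // one random block
        ++hist[p];
      seen += block_size * block_size;
    }
    return hist;
  }

Because the blocks land anywhere in the image, the counts reflect global statistics; repeating the loop with a different seed and comparing bin values is a direct check that enough pixels were taken.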

 

In the old TargetJr days, we randomly selected blocks from the image until enough pixels were obtained to produce a valid histogram.  The upper and lower 5% bounds of the histogram area define the image display stretch limits.  Moreover, it is desirable to cache these histogram-derived bounds in a file associated with the image so that the histogram only has to be computed once.
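
The stretch limits then fall out of a cumulative scan of the sampled histogram; a minimal sketch, with the 5% tails from the scheme above:

  #include <cstddef>
  #include <vector>

  // Find the intensities that cut off the lower and upper 5% of the
  // sampled pixel counts; these are the display stretch limits that
  // would be cached in the file associated with the image.
  void stretch_limits(const std::vector<unsigned long>& hist,
                      std::size_t& low, std::size_t& high,
                      double tail = 0.05)
  {
    unsigned long total = 0;
    for (unsigned long h : hist) total += h;
    const unsigned long cut = static_cast<unsigned long>(tail * total);

    unsigned long acc = 0;
    low = 0;
    while (low + 1 < hist.size() && acc + hist[low] < cut)
      acc += hist[low++];

    acc = 0;
    high = hist.size() - 1;
    while (high > low && acc + hist[high] < cut)
      acc += hist[high--];
  }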

Joe

 

 

      o----o----o----o----o----o----o

Professor of Engineering (Research)     

Barus and Holley Bldg., Room 351        

Brown University                                  

Providence, RI                                      

401-863-2655