[Algorithms] Gaussian blur kernels
From: Joe M. <jo...@ga...> - 2009-07-16 00:48:05
Hey Guys,

I was just wondering if there are specific mappings regarding the relationships between Gaussian kernel sizes, data set sizes, and number of passes. For example, what relationship exists between a 5x5 blur done over, say, a 1024x1024 texture and a 5x5 blur on a 512x512 texture of its downsampled data (assuming a simple average to downsample to the lower res)? Would the 5x5 over the lower-res texture be approximately equivalent to a 9x9 at the higher res, or something along those lines?

Similarly, what about running the 5x5 (or whatever size) N times in succession over the same texture? Is there a mapping that comes into play here, like "X successive passes with a kernel of NxN is approximately equal to one pass with a kernel of MxM"? If there are very specific mathematical relationships with proper formulae, that's great, but even if there aren't, I'd be just as happy with general rules of thumb, best practices based on experience, etc.

Thanks so much in advance for any insight you can provide!

Joe Meenaghan
The Game Institute
jo...@ga...
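For what it's worth, the rule of thumb usually quoted for the repeated-pass question is that Gaussians compose by adding variances: blurring with sigma1 and then sigma2 is equivalent to one blur with sqrt(sigma1^2 + sigma2^2), so N identical passes act like a single pass with sigma * sqrt(N). Below is a minimal numpy sketch that checks this numerically for two passes; the gaussian_kernel_1d helper and the sigma/radius values are just illustrative choices, not from any particular engine.

import numpy as np

def gaussian_kernel_1d(sigma, radius):
    # Discrete, truncated 1-D Gaussian, normalized so the taps sum to 1.
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

sigma, radius = 1.0, 4
one_pass = gaussian_kernel_1d(sigma, radius)
# Running the blur twice is the same as convolving the kernel with itself...
two_pass = np.convolve(one_pass, one_pass)
# ...which should match a single, wider Gaussian with sigma * sqrt(2).
combined = gaussian_kernel_1d(sigma * np.sqrt(2.0), 2 * radius)
print(np.abs(two_pass - combined).max())  # tiny, just truncation error

By the same variance argument, each texel of the 512x512 map stands in for a 2x2 block of the original, so a 5x5 at the lower res covers roughly the footprint of a 9x9 to 11x11 at full res, which is about where the 9x9 guess above lands; the box downsample adds a little extra smoothing on top of that.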