Re: [Algorithms] Gaussian blur kernels
From: Fabian G. <f.g...@49...> - 2009-07-16 17:30:05
> Fabian also mentioned Multirate Filter Banks -- in the audio case, the
> equivalent is the polyphase filter AFAICR, and that's used quite heavily
> in real time, because it actually saves cycles compared to first
> filtering and then decimating. I think a properly implemented multirate
> filter in a pixel shader might actually give you very good bang for the
> cycle buck!

The basic idea behind polyphase filters is that you're doing
downsample(conv(a,b),2), where conv(a,b) is the convolution of a and b,
and downsample is the "elementary" downsample operator (point sampling -
just throw away every second sample). Since you're ignoring every second
sample generated by your Gaussian filter anyway, there's not much point
computing it in the first place. The math behind polyphase filters covers
the specifics of how to do just that: you split both the signal and the
filter into "even" and "odd" parts and then write the result as a sum of
convolutions of these - but it's been a while, so don't ask me about the
details :)

Multirate filter banks are really the filter bank equivalent of discrete
wavelet transforms: you split the signal into low- and high-frequency
parts, then do an extra decomposition on the low-frequency part of the
result, and so on. Each level has half the sampling rate of the previous
one, hence the "multirate". Nobody forces you to use exactly 2 filters
per level, though.

For blurring, you just don't care about the high-frequency parts, so you
throw them away and assume they're all 0 during reconstruction. The
result is a cascade of the form

  downsample -> downsample -> ... -> upsample -> upsample

If you want to do it correctly, the upsampling filter should match your
downsampling filter. If you don't and you're cheap :), you just use the
bilinear upsampler you get for free.

Cheers,
-Fabian "ryg" Giesen
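The even/odd split behind the polyphase trick can be sketched in a few lines of numpy. This is a hypothetical illustration (the function name and structure are mine, not from the thread): the even samples of a filter's output only ever combine even signal samples with even filter taps, and odd with odd, so downsample(conv(a,b),2) equals the sum of two half-length convolutions, with the odd branch delayed by one output sample.

```python
import numpy as np

def polyphase_downsample(a, b):
    """Compute np.convolve(a, b)[::2] via two half-rate convolutions."""
    # Split signal and filter into even and odd phases.
    a_e, a_o = a[::2], a[1::2]
    b_e, b_o = b[::2], b[1::2]
    # Each branch is a convolution at half the sample count.
    c_e = np.convolve(a_e, b_e)
    c_o = np.convolve(a_o, b_o)
    # Output length = ceil(full_conv_length / 2).
    n = len(a) + len(b) - 1
    out = np.zeros((n + 1) // 2)
    out[:len(c_e)] += c_e
    out[1:1 + len(c_o)] += c_o  # odd branch lands one output sample later
    return out
```

The cycle savings come from never touching the half of the work a plain filter-then-decimate pipeline would throw away: each branch convolution runs at half the input rate and half the filter length, so the total multiply count is roughly halved.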
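The downsample/upsample cascade described above can be sketched as a 1-D toy in numpy. Everything here is a hypothetical sketch, not the filters anyone in the thread used: a [1 2 1]/4 kernel stands in for the Gaussian downsampling filter, and a [1/2 1 1/2] kernel on a zero-stuffed signal stands in for the "free" bilinear upsampler (its 2-D separable equivalent).

```python
import numpy as np

def blur_pyramid(x, levels=2):
    """Cheap blur: filter+decimate `levels` times, then upsample back."""
    down_k = np.array([1.0, 2.0, 1.0]) / 4.0  # stand-in Gaussian-ish kernel
    up_k = np.array([0.5, 1.0, 0.5])          # linear-interp upsampling kernel
    for _ in range(levels):
        # Filter, then keep every second sample (downsample -> downsample -> ...)
        x = np.convolve(x, down_k, mode='same')[::2]
    for _ in range(levels):
        # Zero-stuff to double the rate, then interpolate (... -> upsample -> upsample)
        up = np.zeros(2 * len(x))
        up[::2] = x
        x = np.convolve(up, up_k, mode='same')
    return x
```

The high-frequency bands a full filter bank would also produce are simply never computed, which is the "throw them away and assume they're 0" shortcut; using `up_k` instead of a matched reconstruction filter is the cheap-bilinear variant the post mentions.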