From: Michael <xm...@fr...> - 2009-08-29 22:35:46
Hello to all SoX developers. I've been playing with SoX for a few days and I like the noise reduction results. I've tried to write a noise reduction algorithm of my own, but I believe I'm doing something wrong. I am using Windows, and compiling the entire tarball seems tricky, so I am trying to read the noiseprof/noisered sources to get the idea behind SoX's algorithm.

Meanwhile, my own (not properly working) implementation consists of:

* Getting the noise profile by FFTing the noise section. Magnitudes are stored as-is, with no scaling.
* To subtract the noise, I loop over the signal: I FFT each block, find the magnitude of the signal at each FFT point, subtract the noise magnitude, then recalculate the complex number from the new magnitude and the saved phase.

The problem is that there is noise at the end of each FFT pass, so if I use an FFT with 8192 points, there are clicks or Gibbs artifacts every 8192 samples.

Before I immerse myself completely in the source, would anyone quickly explain what SoX does? How does it get the profile? How does it apply it? I am still scanning the source files, but it's a bit hard to read code whose basic structure you are not familiar with.

Thanks for your answer.

Sincerely yours,
Michael.
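P.S. For concreteness, here is roughly what my attempt looks like, with the windowed overlap-add step I suspect I am missing added in. This is only a sketch of the approach described above, not SoX's actual code; all function names are made up, and a small pure-Python FFT is included just to keep the example self-contained. Processing non-overlapping rectangular frames is exactly what produces clicks at each frame boundary; a Hann analysis window at 50% overlap sums to 1, so the overlap-added frames reconstruct the signal smoothly.

```python
import cmath
import math

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * math.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(xs):
    """Inverse FFT via the conjugation trick."""
    n = len(xs)
    y = fft([z.conjugate() for z in xs])
    return [z.conjugate() / n for z in y]

def noise_profile(noise, nfft):
    """Average magnitude spectrum over frames of a noise-only section."""
    profile = [0.0] * nfft
    count = 0
    for start in range(0, len(noise) - nfft + 1, nfft):
        spec = fft(noise[start:start + nfft])
        for k in range(nfft):
            profile[k] += abs(spec[k])
        count += 1
    return [m / max(count, 1) for m in profile]

def spectral_subtract(signal, profile, nfft):
    """Magnitude spectral subtraction with a Hann window and 50% overlap-add."""
    hop = nfft // 2
    # Hann satisfies w(i) + w(i + nfft/2) == 1, so overlap-added frames
    # reconstruct the signal exactly wherever two frames overlap.
    window = [0.5 - 0.5 * math.cos(2 * math.pi * i / nfft) for i in range(nfft)]
    out = [0.0] * len(signal)
    for start in range(0, len(signal) - nfft + 1, hop):
        frame = [signal[start + i] * window[i] for i in range(nfft)]
        spec = fft(frame)
        clean = []
        for k, z in enumerate(spec):
            # Subtract the noise magnitude, clamp at zero, keep the phase.
            mag = max(abs(z) - profile[k], 0.0)
            clean.append(cmath.rect(mag, cmath.phase(z)))
        time_frame = ifft(clean)
        for i in range(nfft):
            out[start + i] += time_frame[i].real
    return out
```

With a zero noise profile this pipeline reproduces the input exactly (away from the edges), which is a handy sanity check that the framing itself no longer introduces artifacts before any subtraction happens.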