From: Roger B. Dannenberg <rbd@cs...> - 2005-03-16 20:41:18
Yes, there's a bug either in the documentation or the code: the table
limit is 100K samples, which is 400KB. The table size limit is a compromise
between giving users the freedom to do what they want and keeping Nyquist
alive. Before there was a limit, users would complain that Nyquist crashed
for no apparent reason. The reason was that Nyquist was trying to allocate
huge tables. In almost all cases, there was a better way to do things. For
convolution, I think you can break the convolution up into two or more
parts, perform them separately, and then shift and sum the results
to get the correct result. I realize this is easy to say and somewhat
painful to code and test. And if you're going to allocate a bunch of big
tables, why not just increase the table size? A 4MB memory allocation
isn't so much on most modern machines. But is that enough? What should the
limit be? (Also, Nyquist doesn't do fast convolution, so the time is
proportional to M*N where M and N are the lengths of the two sounds to be convolved.)
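A minimal sketch of the split-and-sum idea, in Python rather than Nyquist (function names are my own for illustration): because linear convolution distributes over addition, convolving with each half of the impulse response separately, shifting the second result by the first half's length, and summing gives the same output as convolving with the whole thing.

```python
def conv(x, h):
    # Direct O(M*N) linear convolution, like Nyquist's non-fast convolve.
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def conv_split(x, h):
    # Break the impulse response h into two parts, convolve each part
    # separately, then shift the second result by len(h1) and sum.
    mid = len(h) // 2
    h1, h2 = h[:mid], h[mid:]
    y1 = conv(x, h1)
    y2 = conv(x, h2)
    out = [0.0] * (len(x) + len(h) - 1)
    for i, v in enumerate(y1):
        out[i] += v            # first part, no shift
    for i, v in enumerate(y2):
        out[i + mid] += v      # second part, shifted by len(h1)
    return out
```

The same argument extends to more than two parts, which is what lets each piece stay under the table size limit.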
I guess the thing to do is increase the max size to 1M samples and put in some
error reporting in case the allocation fails, so when it does the user gets a
meaningful error report. I'll work on it.
Meanwhile, have you tried the convolution at 22kHz? 88K samples (4 seconds
at 22kHz) should fit into the current 100K sample limit.
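For reference, the arithmetic behind those numbers, assuming 4-byte float samples and taking "22kHz" to mean 22,050 Hz (half of 44.1kHz):

```python
bytes_per_sample = 4  # one 4-byte float per table sample

current_limit_bytes = 100_000 * bytes_per_sample    # 100K samples = 400KB
proposed_limit_bytes = 1_000_000 * bytes_per_sample # 1M samples = 4MB

# 4 seconds at 22,050 Hz is 88,200 samples, which fits under the
# current 100K-sample table limit.
samples_4s_at_22k = 4 * 22_050
```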