From: Robert D. <rob...@gm...> - 2024-11-28 18:45:47
On Wed, Nov 27, 2024 at 6:27 PM Eduardo Ochs <edu...@gm...> wrote:

> integrate_qagi(f,x,a,b) := quad_qagi(f,x,a,b)[1]$
> h(x) := integrate_qagi(f(t)*g(x-t), t, minf, inf);
>
> draw2d(xrange=[-4,4],
>        yrange=[-4,4],
>        proportional_axes=xy,
>        explicit(h(w),w,-4,4));

Almost there; to get this formulation to work, the evaluation of quad_qagi needs to be postponed until w has a numerical value. When you write explicit(h(w), ...), h is evaluated with its argument being the symbol w (not yet a number), and quad_qagi complains about that. I think that if you try explicit(h, ...) (just the function name), or explicit(lambda([foo], h(foo)), ...) (wrapping the call to h in an unnamed function, whose body is not evaluated until the function is applied), it will give the expected result.

Now the real fun of this convolution business is in repeating the operation and seeing that the result looks more and more like a Gaussian bump (assuming you start with a distribution which has finite variance). I've revisited this problem from time to time over the years; some of my attempts can be seen in this folder:

https://github.com/maxima-project-on-github/maxima-packages/tree/master/robert-dodier/boxcar_convolution

See in particular repeated_convolution.mac and its associated output repeated_convolution_plot.png. The plot is interesting because it shows a progression towards more and more Gaussian bumps, except that the last one has a lot of noise at the upper end -- this is the consequence of adding together a large number of terms with large coefficients and differing signs. One of those cases in which the "exact" result is not so exact, due to numerical evaluation!

The scripts boxcar_convolution.mac and boxcar_convolution_simpler.mac (I guess I had at least two attempts) try to construct "nice" representations of the convolution result in terms of boxcar (i.e., constant on an interval) functions. I think that might be well suited to your original problem -- let me know if you are interested in pursuing that.
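To make the fix concrete, here is a minimal self-contained sketch. The boxcar definitions of f and g are my own assumed example (not from Eduardo's original code); I use signum instead of an `if' expression so that f(t)*g(x-t) can be formed while t is still symbolic, and I wrap the call to h in a lambda so quad_qagi only ever sees numeric arguments.

```maxima
/* Convolution of two boxcars, with evaluation of h postponed until the
   plot variable has a numerical value.  f and g are assumed examples:
   unit boxcars on [-1/2, 1/2], written with signum rather than `if'
   so the integrand can be constructed with t still symbolic. */
load(draw)$
f(x) := (signum(x + 1/2) - signum(x - 1/2))/2$
g(x) := f(x)$
integrate_qagi(e, t, a, b) := quad_qagi(e, t, a, b)[1]$
h(x) := integrate_qagi(f(t)*g(x - t), t, minf, inf)$
/* explicit(h(w), ...) would call h with the symbol w; wrapping the call
   in a lambda delays evaluation until w is bound to a number. */
draw2d(xrange = [-4, 4],
       yrange = [-4, 4],
       proportional_axes = xy,
       explicit(lambda([w], h(w)), w, -4, 4))$
```

The resulting plot should show the triangular bump on [-1, 1] which is the convolution of two unit boxcars.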
I can think of a couple of examples that might shed some light for pedagogical purposes. One is a very non-Gaussian density such as exp(-x)*unit_step(x), and the other is the apparently smooth Cauchy density, 1/%pi * 1/(1 + x^2). The former should converge to a Gaussian bump under repeated convolution, since it has finite variance, while the latter should not (in fact we should find it is invariant under convolution, up to rescaling), since it does not have finite variance. If you are interested, I will try to work out these examples.

All the best,

Robert
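P.S. A quick symbolic sketch of the exponential example, under the assumption that for densities supported on [0, inf) the convolution integral reduces to an integral over [0, x] (the helper name conv_pos is mine, not from any package):

```maxima
/* Repeated self-convolution of exp(-x)*unit_step(x).  Restricting to
   the positive half-line makes the convolution an integral over [0, x],
   so unit_step itself never has to appear in the integrand. */
assume(x > 0)$
conv_pos(u, v) := integrate(subst(t, x, u) * subst(x - t, x, v), t, 0, x)$
f1 : exp(-x)$
f2 : conv_pos(f1, f1);   /* x*exp(-x) */
f3 : conv_pos(f2, f1);   /* x^2*exp(-x)/2 */
f4 : conv_pos(f3, f1);   /* x^3*exp(-x)/6 */
```

These are the Gamma(n) densities x^(n-1)*exp(-x)/(n-1)!, which, after standardizing to zero mean and unit variance, approach a Gaussian bump as n grows. For the Cauchy example, one can ask Maxima for integrate((1/%pi)/(1 + t^2) * (1/%pi)/(1 + (x - t)^2), t, minf, inf); mathematically the result is (1/%pi)*2/(x^2 + 4), i.e., Cauchy again with scale 2, illustrating the invariance up to rescaling.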