From: David Neubelt <david@re...>  2009-03-10 19:12:13

Matthew,

You can read up on moving least squares here: http://www.ams.org/mcom/199867224/S0025571898009740/S0025571898009740.pdf

For that matter, I believe least squares works in n dimensions; I just never tried it.

Also, discrete wavelets are used in image compression (essentially a 2D function, f(x, y) = color), and I could imagine they can be applied to your problem. Wavelets are not smooth, though, and require a lot of terms to approximate a quarter-wave sinusoidal function. Still, wavelets have a lot of really nice properties, so they may be something to consider.

What I've also done in the past is pick the strongest qualities of a function, such as F(0) = 0, F looking like cos(theta) around 0, and f'(1) = 0. I then pick a polynomial with as many coefficients as I have constraints and solve for the coefficients. That usually gives really nice results, because I'm choosing where the error should be small, and if I can get away with a low number of constraints then I have a low number of coefficients, which makes the approximation fast and accurate. Of course, with this method I know what my function needs to look like, so I can 'craft' the approximating function easily. If you're fitting an unknown function, this approach won't work.

Schlick used this technique in his Fresnel approximation. Here is an example from his paper, approximating sin(x) on the range [0, pi/2]:

  f(0) = 0, f'(0) = 1, f(pi/2) = 1, f'(pi/2) = 0   // constraints the function should satisfy
  f(x) = (x^2 + a*x + b) / (c*x + d)               // an approximating function

The conditions lead to a system of four equations:

  b = 0
  a = d
  a + pi/2 = c + 2a/pi
  c*pi^2 + 4*pi*a + 4*a^2 = 0

giving the following approximation of sin(x):

  f(x) = x * (u + x) / (u + v*x), with u = -pi^2/4 and v = pi + u

He reported a relative error of 1.4% and a speedup factor of 230%. Incidentally, his Fresnel approximation had a 0.6% error and a 3180% speedup :)

- Dave

----- Original Message -----
From: Matt J [mailto:mjohnson2005@...]
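To make Schlick's solved-out result concrete, here is a minimal Python sketch of the rational sin approximation, using the constants that fall out of the four constraints above (the names U, V, and sin_approx are mine, not from the paper):

```python
import math

# Rational approximation of sin(x) on [0, pi/2], reconstructed from the
# constraints f(0) = 0, f'(0) = 1, f(pi/2) = 1, f'(pi/2) = 0 applied to
# f(x) = (x^2 + a*x + b) / (c*x + d). Solving the resulting system gives
# b = 0, a = d = -pi^2/4, c = pi - pi^2/4, i.e. u = -pi^2/4, v = pi + u.
U = -math.pi ** 2 / 4.0
V = math.pi + U

def sin_approx(x: float) -> float:
    """Evaluate x * (u + x) / (u + v*x)."""
    return x * (U + x) / (U + V * x)
```

By construction the value and slope match sin exactly at both ends of the interval; in between, the approximation stays within a few percent of sin(x).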
Sent: Monday, March 09, 2009 6:11 PM
To: Game Development Algorithms
Subject: [Algorithms] Multivariant curve fitting

How can I approximate a function such as func(a, b) = using polynomials? Right now I can use Maple and minimax to generate an approximation for any function that is differentiable over a given domain. However, it is not clear to me how to extend that to multiple variables, e.g.

  func(t, theta), t = 0..1, theta = (-PI, PI)

My numerical approximation book covers least-squares fitting of a nonlinear function of a single variable, and of a linear function of several variables, so it seems like it may be possible to extend that to a multivariable nonlinear function. Is this the right approach, or is there another way to do this?

One issue with this approach is that I'd have to generate a lot of data from my reference function (and it isn't clear how big the dt or d(theta) steps would need to be for an optimal solution), and I imagine it would be slow. Note that the latter isn't an issue, since I would generate it offline.

Matthew