From: Matt J <mjohnson2005@gm...>  20090311 21:21:40

This sounds like a possible approach if you can extend the Remez algorithm to 2D, because Newton's method extends to 2D as well. If a more accurate curve can be formed iteratively by looking for the local minima and maxima of a 1D error curve after sampling a certain number of points (why do I have a feeling this is related to the Nyquist limit?), then why not extend that iterative method to a more accurate 2D surface by looking at the gradient (and the Hessian matrix)? I've never done this before, but it seems possible.

> I saw this on Insomniac's R&D page awhile back, perhaps it would help:
>
> http://www.insomniacgames.com/tech/articles/0209/ramezexchange.php
>
> On Mon, Mar 9, 2009 at 8:10 PM, Matt J <mjohnson2005@...> wrote:
> How can I approximate a function such as func(a, b) = using polynomials?
>
> Right now I can use Maple and minimax to generate an approximation for
> any function that can be differentiated over a given domain.
>
> However, it is not clear to me how I can extend that to multiple
> variables, e.g. func(t, theta), t = 0..1, theta = (PI, PI).
>
> My numerical approximation book covers least-squares fitting of a
> nonlinear function of a single variable and of a linear function of
> several variables, so it seems like it may be possible to extend
> that to a multivariable nonlinear function. Is this the right
> approach, or is there another way to do it? One issue with this
> approach is that I'd have to generate a lot of data from my reference
> function (and it isn't clear how big the dt or d(theta) steps would
> need to be to get an optimal solution), and I imagine it would be
> slow. Note that the latter isn't an issue, since I would generate it
> offline.
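For what it's worth, here's a minimal sketch of the 2D Newton step I mean: finding a stationary point of a function of two variables from a finite-difference gradient and Hessian. The test function and iteration count are my own invented example, not anything from the Remez article:

```python
import numpy as np

def newton_extremum_2d(f, x0, steps=20, h=1e-4):
    """Find a stationary point of f: R^2 -> R by Newton's method,
    using central finite differences for the gradient and Hessian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        a, b = x
        # central-difference gradient
        g = np.array([
            (f(a + h, b) - f(a - h, b)) / (2 * h),
            (f(a, b + h) - f(a, b - h)) / (2 * h),
        ])
        # finite-difference Hessian (symmetric 2x2)
        faa = (f(a + h, b) - 2 * f(a, b) + f(a - h, b)) / h**2
        fbb = (f(a, b + h) - 2 * f(a, b) + f(a, b - h)) / h**2
        fab = (f(a + h, b + h) - f(a + h, b - h)
               - f(a - h, b + h) + f(a - h, b - h)) / (4 * h**2)
        H = np.array([[faa, fab], [fab, fbb]])
        # Newton step: solve H * dx = g, move against it
        x = x - np.linalg.solve(H, g)
    return x

# Example: this f has its maximum at (1, -2)
f = lambda a, b: -(a - 1)**2 - (b + 2)**2
print(newton_extremum_2d(f, (0.0, 0.0)))  # converges to (1, -2)
```

On a quadratic like this the step lands on the extremum immediately; for a general error surface you'd iterate and also check that the Hessian has the right definiteness for a max vs. a min.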
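And a sketch of the least-squares direction from the quoted question: the reference function can be nonlinear, but a polynomial model is linear in its coefficients, so an ordinary linear least-squares solve fits a tensor-product polynomial over a sampled grid. The domain t in [0, 1] and theta in [-pi, pi], the grid size, the degree, and the reference function below are all assumptions for illustration:

```python
import numpy as np

def fit_poly2d(func, deg, n=25):
    """Least-squares fit of a tensor-product polynomial
    p(t, theta) = sum_{i,j} c[i, j] * t**i * theta**j
    to func on an n-by-n grid, t in [0, 1], theta in [-pi, pi]."""
    t = np.linspace(0.0, 1.0, n)
    th = np.linspace(-np.pi, np.pi, n)
    T, TH = np.meshgrid(t, th)
    # design matrix: one column per monomial t^i * theta^j
    cols = [(T**i * TH**j).ravel()
            for i in range(deg + 1) for j in range(deg + 1)]
    A = np.column_stack(cols)
    y = func(T, TH).ravel()
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return c.reshape(deg + 1, deg + 1)

def eval_poly2d(c, t, theta):
    """Evaluate the fitted polynomial at (t, theta)."""
    deg = c.shape[0] - 1
    return sum(c[i, j] * t**i * theta**j
               for i in range(deg + 1) for j in range(deg + 1))

# hypothetical reference function, stand-in for func(t, theta)
ref = lambda t, th: np.sin(th) * np.exp(-t)
c = fit_poly2d(ref, deg=5)
err = abs(eval_poly2d(c, 0.3, 1.0) - ref(0.3, 1.0))
```

This is L2 (least-squares) rather than minimax, so it won't match what Maple's minimax gives you in 1D, but it handles the multivariable case directly; denser grids mainly refine the fit near where you sample, which is the dt/d(theta) question from the original post.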