From: Richard F. <fa...@gm...> - 2022-07-31 00:11:23
There is an area of study called "automatic differentiation" which addresses techniques for taking an algorithm for computing f(x) at some point x0 and creating an algorithm for computing df(x) at that point x0. Google finds many hits. Oddly, this is quite different from differentiating a symbolic expression; it is more closely allied with expansion of a function in a Taylor series.

Your use of gradef seems to be defining the function a(x) as a solution to a differential equation that looks, at first sight, unpleasant. I don't know if auto diff would be useful, or if a reformulation using Taylor series might help.

Just 2 thoughts.
RJF

On Sat, Jul 30, 2022 at 3:32 PM Andrei Zorine <zoa...@gm...> wrote:
> Gentlemen,
> can the DIFF function be taught to deal with a BLOCK generated by OPTIMIZE?
> Consider an example:
>
> depends(a,x);
> gradef(a,x,(a+x)/(a^2+a*x-1));
> diff( (x*a-a^2)/(a+x+1)^3,x);
> diff( (x*a-a^2)/(a+x+1)^3,x,2);
>
> Here the second call produces a lengthy expression. If we ask for the 20th
> derivative instead, the answer will blow up memory.
> Instead, I'd like to have something like
> diff( optimize(diff((x*a-a^2)/(a+x+1)^3, x)), x)
> in an efficient and concise form.
> If I remember correctly, Maple can do this.
>
> Andrei Zorine
> _______________________________________________
> Maxima-discuss mailing list
> Max...@li...
> https://lists.sourceforge.net/lists/listinfo/maxima-discuss
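To make the "algorithm for f(x) yields an algorithm for df(x)" idea concrete, here is a minimal sketch of forward-mode automatic differentiation using dual numbers. Every value carries the pair (f(x0), f'(x0)) and each arithmetic operation propagates both components by the chain rule. This is illustrative Python, not from any particular AD library; the class and function names are made up for the example.

```python
class Dual:
    """A value paired with its derivative: (f(x0), df/dx at x0)."""

    def __init__(self, val, der=0.0):
        self.val = val   # f(x0)
        self.der = der   # f'(x0)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (f + g)' = f' + g'
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (f g)' = f' g + f g'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def deriv(f, x0):
    """Evaluate f(x0) and df/dx at x0 in a single pass over f's algorithm."""
    y = f(Dual(x0, 1.0))  # seed dx/dx = 1
    return y.val, y.der

# f(x) = x^2 + 3x at x0 = 2: f(2) = 10, f'(2) = 7
val, der = deriv(lambda x: x * x + 3 * x, 2.0)
```

Note that no symbolic expression for f' is ever built: the derivative is computed numerically alongside the value, so the cost per derivative stays proportional to the cost of evaluating f itself.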
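The Taylor-series reformulation mentioned above is the higher-order version of the same trick, and it speaks directly to the 20th-derivative blowup: represent a quantity by its truncated Taylor coefficients [c0, c1, ..., cN] around x0, propagate them through the arithmetic (multiplication becomes a Cauchy product), and read off the n-th derivative as n! * cn. Costs then grow polynomially in N rather than exploding the way repeated symbolic diff() does. Again a hedged sketch in illustrative Python, not an existing Maxima facility:

```python
from math import factorial

N = 6  # truncation order: keep coefficients c0..cN

def series_add(a, b):
    # coefficient-wise sum of two truncated Taylor series
    return [x + y for x, y in zip(a, b)]

def series_mul(a, b):
    # Cauchy product, truncated at order N
    c = [0.0] * (N + 1)
    for i in range(N + 1):
        for j in range(N + 1 - i):
            c[i + j] += a[i] * b[j]
    return c

def nth_derivative(coeffs, n):
    # f^(n)(x0) = n! * c_n
    return factorial(n) * coeffs[n]

# Example: f(x) = x^3 expanded around x0 = 2 via x = 2 + t.
x = [2.0, 1.0] + [0.0] * (N - 1)     # the identity jet at x0 = 2
f = series_mul(series_mul(x, x), x)  # x^3 as a truncated series
# f(2) = 8, f'(2) = 12, f'''(2) = 6, and all derivatives above the 3rd vanish
```

For a function defined implicitly through a gradef-style ODE, the recurrence for the coefficients of a(x) would have to be derived from that ODE and threaded through the same series arithmetic; whether that is practical for the differential equation in your example, I don't know.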