From: Etienne G. <et...@is...> - 2002-05-17 18:19:59
Hello,

OK, I guess the obsolete things are really obsolete and can be cleaned
out. I will rename bs_gradient2.m to bs_gradient.m (and modify bfgs
accordingly). I will remove dfp.m (unfortunately, changing
bs_gradient.m breaks dfp.m), so the bad news is that we are losing
dfp.m. I will also remove __quasi_func__.m, which becomes useless.

As far as derivatives are concerned, numerical derivatives will be the
default (as it was originally). A separate derivative function can be
passed using the "df" option. If the objective function itself, when
asked for two output values, returns the derivative as the second, the
"jac" option ("GradObj" in m*tlab) can tell bfgs.m to use that.

Cheers,
Etienne

On Fri, May 17, 2002 at 12:31:22PM -0500, Ben Sapp wrote:
# Hi Etienne,
#
# Please excuse the delayed response.
#
# On Friday 19 April 2002 01:18 pm, Etienne Grossmann wrote:
# > this is to ask the opinion of Octave-Forge developers and users on
# > what to do about the optimization functions:
# >
# > My opinion is to replace the bfgs.m and nrm.m files so that the new
# > bfgs() function takes the same parameters as the actual cg_min(),
# > and remove this last function.
#
# I agree. However, I think this does not go far enough. I think we
# should also remove dfp.m, __quasi_func__.m and bs_gradient.m. We
# might also rename the new function you created, bs_gradient2.m, to
# bs_gradient.m or something more appropriate. These all become
# obsolete with the work you have done. I think we should avoid keeping
# obsolete things around just for the sake of keeping them around.
#
# If you agree and you'd like me to do any of this, let me know.
#
# > I have verified, on various quadratic programming cases and initial
# > positions, that my modified bfgs2() executes the same algorithm, in
# > the same time (no extra overhead), as the original bfgs(). In terms
# > of speed, it is thus much quicker than cg_min(). In terms of
# > flexibility, it is better than bfgs(), since it can optimize wrt
# > any (not just the 1st) argument, and the termination criterion can
# > be tweaked.
#
# Great! I think this reaffirms that the new work you have done makes a
# fair amount of the old optimization stuff obsolete.
#
# > Another difference with the original bfgs(), which may be either an
# > advantage or a pain, is that it requires the derivatives of the
# > function to be provided by the user. This is an advantage because,
# > if a hand-made derivative is available, it can be used instead of
# > numerical differentiation. It has the slight disadvantage that, if
# > you want to use numerical differentiation, you must provide the
# > derivative function yourself, e.g. with cdiff().
#
# At some point, would you mind if I added a feature so that, if
# derivatives are not supplied, we go ahead and try to approximate them
# numerically?
#
# Cheers,
# Ben.

--
Etienne Grossmann ------ http://www.isr.ist.utl.pt/~etienne
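[Editor's note] The derivative-passing scheme discussed in this thread
(numerical differentiation by default, a separate derivative function
via "df", or a two-output objective via "jac"/"GradObj") can be
sketched as follows. This is an illustrative Python sketch, not the
actual bfgs.m code: the names get_gradient and numerical_gradient, and
the want_gradient keyword, are made-up stand-ins for Octave's nargout
mechanism and for a cdiff()-style helper.

```python
def numerical_gradient(f, x, h=1e-6):
    """Central-difference gradient, in the spirit of Octave's cdiff()."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

def get_gradient(f, x, df=None, jac=False):
    """Choose the gradient source the way the mail describes it."""
    if jac:
        # "jac" option: f itself returns (value, gradient) when asked
        # for two output values (nargout == 2 in Octave).
        _, g = f(x, want_gradient=True)
        return g
    if df is not None:
        # "df" option: a separate, hand-made derivative function.
        return df(x)
    # Default: numerical differentiation.
    return numerical_gradient(f, x)

# Example objective: f(x) = x1^2 + 3*x2^2, gradient (2*x1, 6*x2).
def f(x, want_gradient=False):
    val = x[0] ** 2 + 3.0 * x[1] ** 2
    if want_gradient:
        return val, [2.0 * x[0], 6.0 * x[1]]
    return val

g_num = get_gradient(f, [1.0, 2.0])            # numerical default
g_jac = get_gradient(f, [1.0, 2.0], jac=True)  # exact, via "jac"
```

Both paths agree here (up to truncation error in the numerical case),
which is the point of letting the user supply exact derivatives only
when they are cheap to write down.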