RE: [Algorithms] Finding optimal transformations
From: Willem de B. <wd...@pl...> - 2005-05-17 13:05:03
No, you assume the objective function can be locally accurately
represented by a quadratic function (i.e., the first three terms of its
Taylor series). Then you perform some sort of Newton step to find the
next best approximate point.

So, at each iteration you calculate the gradient and Hessian (which are
given in closed form in the paper) at the current "best" point, then
base your next iteration point on that.

_____

From: gda...@li... [mailto:gda...@li...] On Behalf Of Bill Baxter
Sent: Tuesday, May 17, 2005 2:36 PM
To: gda...@li...
Subject: Re: [Algorithms] Finding optimal transformations

On 5/17/05, Willem de Boer <wd...@pl...> wrote:

> Hi Per,
>
> "> > into one that can be solved by quadratic programming!
> Quadratic programming is pretty hardcore. Even special cases [...]"
>
> Whoops, my bad. I didn't mean quadratic programming. I meant
> the whole thing turns into optimising a quadratic function, with
> no constraints, and without having to introduce a degree of freedom
> for each added constraint.

And you can find the optimum of an unconstrained quadratic function
with just a single linear system solve, no? So that does sound pretty
handy.

--bb
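The iteration described above can be sketched in a few lines. This is a minimal, generic Newton-step loop, not the method from the paper under discussion: the function names and the Rosenbrock test problem are illustrative assumptions. The key points from the thread are both visible here: each iteration builds the local quadratic model f(x + d) ~= f(x) + g.d + 0.5 d.H.d from the closed-form gradient and Hessian, and the minimiser of that unconstrained quadratic comes from a single linear system solve, H d = -g.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Plain Newton iteration: at the current best point, fit the
    local quadratic (Taylor) model and jump to its minimiser.
    The step d solves the linear system  H(x) d = -g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)  # one linear solve per step
        x = x + d
    return x

# Illustrative test problem (not from the paper): the Rosenbrock
# function f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, whose gradient and
# Hessian are known in closed form; its minimum is at (1, 1).
def grad(v):
    x, y = v
    return np.array([-400.0 * x * (y - x**2) - 2.0 * (1.0 - x),
                     200.0 * (y - x**2)])

def hess(v):
    x, y = v
    return np.array([[1200.0 * x**2 - 400.0 * y + 2.0, -400.0 * x],
                     [-400.0 * x, 200.0]])

x_star = newton_minimize(grad, hess, [1.2, 1.2])
```

For a function that really is quadratic, the model is exact and a single iteration lands on the optimum, which is the point Bill makes below about the unconstrained case.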