RE: [Algorithms] Finding optimal transformations
From: Peter-Pike S. <pp...@wi...> - 2005-05-19 00:14:16
This is along the lines of the ideas presented in the papers earlier in this thread. In particular, if you compute the SVD of the covariance matrix (or its eigenvectors -- they are the same thing in this case), you kind of ignore the diagonal terms (which is the "scaling" that exists in the optimal 3x3 transform).

Using the SVD it is easy to handle the degenerate case you mention as well...

-Peter-Pike Sloan

(The covariance matrix turns out to be the p p^t that you use below. You never need to build the matrix p; you can build the covariance matrix directly, though -- it is simply the sum of the outer products of the points minus the means...)

________________________________
From: gda...@li... [mailto:gda...@li...] On Behalf Of Bill Baxter
Sent: Wednesday, May 18, 2005 4:24 PM
To: gda...@li...
Subject: Re: [Algorithms] Finding optimal transformations

Just thought I'd throw this in the mix since no one mentioned it. If you just need the optimal 3x3 _transformation_, period, and you don't care whether it's in SO(3), it's actually quite easy.

Put all the original points in a 3xN matrix p, and all the corresponding target points in a 3xN matrix q, and subtract the centroid off both sets of points.

Then you basically want to find the 3x3 matrix, T, that solves:

    T p = q

Except generally it's overconstrained, and p is non-square so you can't invert it. But you can do this:

    T p p^t = q p^t
    T (p p^t)(p p^t)^-1 = q p^t (p p^t)^-1
    T = q p^t (p p^t)^-1

In other words, just use the pseudoinverse of p, and that actually gives you the least squares solution. The nice thing is (p p^t) is just a 3x3 matrix, so it's easy to invert. Of course if (p p^t) is singular, then you need a backup plan. That happens whenever all the p points are collinear or coplanar, so it's not something you can generally ignore. I'm not sure what the most appropriate backup plan is for those degenerate cases. Have to think about it some more.
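For what it's worth, the derivation above translates almost line-for-line into numpy. This is just a sketch of the idea, not anyone's production code; the function name `fit_linear_transform` is my own, and it assumes (p p^t) is non-singular, i.e. the non-degenerate case:

```python
import numpy as np

def fit_linear_transform(p, q):
    """Least-squares 3x3 transform T minimizing |T p - q|_F.

    p, q: 3xN arrays of corresponding points, centroids already subtracted.
    Implements T = q p^t (p p^t)^-1 from the derivation above.
    Assumes the points are not all collinear/coplanar, so p p^t is invertible.
    """
    ppt = p @ p.T  # 3x3 (this is also the covariance matrix Peter-Pike mentions)
    qpt = q @ p.T  # 3x3
    # Solve T ppt = qpt without forming an explicit inverse:
    # transpose both sides to get ppt^t T^t = qpt^t, then solve for T^t.
    return np.linalg.solve(ppt.T, qpt.T).T

# Quick check: recover a known 3x3 transform from 50 random correspondences.
rng = np.random.default_rng(0)
T_true = rng.standard_normal((3, 3))
p = rng.standard_normal((3, 50))
p -= p.mean(axis=1, keepdims=True)  # subtract centroid
q = T_true @ p
T = fit_linear_transform(p, q)
print(np.allclose(T, T_true))  # True
```

With noisy or inconsistent correspondences the same call returns the least-squares fit rather than an exact recovery.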
--bb

On 5/17/05, Bill Baxter <wb...@gm...> wrote:

Oh, ok, so it becomes a standard unconstrained nonlinear optimization problem then. It sounded like you were saying the objective itself was quadratic. I see now.

So their main idea is just to take each Newton optimization step using a local parameterization of the rotation (like R0 * incrementalRotation(param[3])) rather than doing the whole optimization with a fixed parameterization (like R(param[3])), where 'param[3]' represents your favorite 3-parameter representation of rotations. So you could see the whole thing as not being so different from optimization on SO(3) with Euler angles, except they avoid the singularities by reparameterizing locally every step and accumulating the progress made thus far into the R0 matrix. Makes sense. Not as spectacularly cool as it sounded initially, though.

Coincidentally, I took a robotics course from the second author, at just about the same time he was writing that paper, it appears. Small world.

On 5/17/05, Willem de Boer <wd...@pl...> wrote:

No, you assume the objective function can be locally accurately represented by a quadratic function (i.e., the first 3 terms of its Taylor series). Then you perform some sort of Newton step to find the next best approximate point.
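To make the "R0 * incrementalRotation(param[3])" scheme concrete, here is a toy sketch of that local-reparameterization loop, under my own simplifications: a plain gradient step stands in for the Newton step the paper takes, the objective is just aligning one vector onto another, and `incremental_rotation` is Rodrigues' formula for a small axis-angle vector. The point is only the structure: each step is solved in a fresh local 3-parameter chart at the identity, then folded into the accumulated R0, so the chart never wanders near a singularity:

```python
import numpy as np

def incremental_rotation(w):
    """Rodrigues' formula: rotation matrix for the axis-angle vector w (3,)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# Toy objective: rotate a onto b, f(R) = |R a - b|^2.
# Its gradient in the local axis-angle chart at R0 works out to
# 2 * a x (R0^t (R0 a - b)).
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])

R0 = np.eye(3)
for _ in range(100):
    r = R0 @ a - b
    grad = 2.0 * np.cross(a, R0.T @ r)      # gradient in the local chart
    delta = -0.1 * grad                     # gradient step (Newton step in the paper)
    R0 = R0 @ incremental_rotation(delta)   # fold progress into R0, re-center chart

print(np.allclose(R0 @ a, b))  # True: converges to the 90-degree rotation about z
```

Compare this with running the whole optimization in one fixed Euler-angle chart, where the iterates can drift into gimbal lock; here every step sees a well-conditioned parameterization.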