Good morning guys,

Thanks for your help. After a hectic weekend, I think I solved most of my problems.

The first problem was that my function wasn't continuous at small scales (as Ian pointed out). As a result,
the LM would converge immediately, since the first few guesses were very close to each other.
I solved this problem by fixing the 3rd parameter and observing how the function behaves at very small scales.

The second problem was that even after the (x,y) parameters started to change, the radius stayed fixed. I found
that it was always converging to a local minimum. I made another small change to my cost function and the problem was solved.

All in all, it's working now, and there's no need to change any of the default parameters.
Now I'm improving my cost function again; hopefully it will still work!
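
For anyone who hits the same thing later, this is roughly the shape of the setup
I'm talking about. The residual below (distance from each edge point to the
circle) and the names are only an illustration, not my actual cost function:

#include <cmath>
#include <vector>
#include <vnl/vnl_vector.h>
#include <vnl/vnl_least_squares_function.h>
#include <vnl/algo/vnl_levenberg_marquardt.h>

// One residual per edge point: its distance from the circle (cx, cy, r).
class circle_residuals : public vnl_least_squares_function
{
public:
  circle_residuals(const std::vector<double>& px, const std::vector<double>& py)
    : vnl_least_squares_function(3, (unsigned int)px.size(), no_gradient),
      px_(px), py_(py) {}

  void f(vnl_vector<double> const& p, vnl_vector<double>& fx)
  {
    for (unsigned i = 0; i < px_.size(); ++i)
    {
      double dx = px_[i] - p[0], dy = py_[i] - p[1];
      fx[i] = std::sqrt(dx*dx + dy*dy) - p[2];   // signed distance to the circle
    }
  }

private:
  std::vector<double> px_, py_;
};

// Usage (edge_x/edge_y are the detected edge points):
// circle_residuals fn(edge_x, edge_y);
// vnl_levenberg_marquardt lm(fn);
// vnl_vector<double> p(3); p[0] = 157; p[1] = 157; p[2] = 100;  // initial guess
// lm.minimize(p);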

Tamir

On 11/17/06, Ian Scott <ian.m.scott@student.manchester.ac.uk> wrote:
Tamir Yedidya wrote:
>
> Hi,
>
> I'm trying to use vnl_levenberg_marquardt in order to fit a circle to an
> image, so I have 3 parameters (x, y, r).
> My initial guess is usually quite good. For example, my estimate might be
> (157,157,100) while the best fit is (150,150,100).
>
> However, I found out that when using the default settings the algorithm
> will converge to the initial guess.
> I've tried setting the epsilon_function to 0.005 or higher values; then the
> estimates start moving and the algorithm converges, but not always to the
> correct answer. With different settings I might have to set the
> epsilon_function to a different value for the algorithm to converge to the
> right answer.
>
> My question is whether setting epsilon is the right thing to do, or whether
> the default parameters should be good enough and the problem is with the
> cost function?

If you have problems with a standard optimiser, the first place to look
is always the cost function. In this case, plot it (the residual sum of
squares) along each of the three parameter axes. It is also worth plotting
the 2D surfaces you get by varying pairs of parameters.
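
For example, a slice along one axis might look like this; rss() below is only
a toy stand-in for however you actually compute the residual sum of squares
from the image:

#include <cmath>
#include <cstdio>

// Toy stand-in for the real cost: sum of squared point-to-circle distances
// for a synthetic circle centred at (150,150) with radius 100.
double rss(double cx, double cy, double r)
{
  const double pi = 3.14159265358979;
  double s = 0;
  for (int i = 0; i < 360; ++i)
  {
    double px = 150 + 100*std::cos(i*pi/180);
    double py = 150 + 100*std::sin(i*pi/180);
    double d = std::sqrt((px-cx)*(px-cx) + (py-cy)*(py-cy)) - r;
    s += d*d;
  }
  return s;
}

int main()
{
  // 1D slice along the radius axis, with the centre held at the initial guess;
  // plot the two columns to see the shape of the cost along that axis.
  for (double r = 80; r <= 120; r += 0.5)
    std::printf("%g %g\n", r, rss(157, 157, r));
  return 0;
}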

Since you are using a residuals-based optimiser, it is also worth
checking that your residuals behave correctly. Pick a residual vector r_1
at some position x_1, and then plot the projection of the residuals onto
r_1 for different parameter values. This way you can see whether your
residuals are behaving vaguely linearly around the minimum. Repeat for
several different x_1.
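
A sketch of that check, assuming residual() is your own residual evaluation
(declared here but not defined):

#include <cstdio>
#include <vnl/vnl_vector.h>

vnl_vector<double> residual(vnl_vector<double> const& x);  // assumed: your residuals

// Take r_1 = residual(x_1), then plot residual(x) . r_1 / |r_1| as one
// parameter is varied; near the minimum this should change roughly linearly.
void check_residuals(vnl_vector<double> const& x1)
{
  vnl_vector<double> r1 = residual(x1);
  double n1 = r1.magnitude();
  for (double d = -5; d <= 5; d += 0.1)
  {
    vnl_vector<double> x = x1;
    x[0] += d;                                             // perturb one parameter
    std::printf("%g %g\n", d, dot_product(residual(x), r1) / n1);
  }
}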

You may also need to look at the cost function at various scales of
change in the parameters. A function that looks smooth at
x_0=[-3:0.1:3] might start to look very noisy and discontinuous at much
smaller scales, e.g. x_0=[-3e-4:1e-5:3e-4]. A useful compromise is to use a
bi-log scale in x:

x = -3;
while (x < -1e-5)
{
   x = x/1.2;     // shrink towards zero on the negative side
   plot f(x);
}
x = 1e-5;
while (x < 3)
{
   x = x*1.2;     // grow away from zero on the positive side
   plot f(x);
}

If you are explicitly calculating the Jacobian (rather than letting
the cost function estimate it using finite differences), you may also
need to check that your gradients behave correctly. Again, pick a
gradient g_1 at a given x_1, and plot the magnitude of the gradient for
different x, projected onto g_1. Again, repeat for several different x_1.
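
In the same spirit, a sketch of the gradient check; cost() and grad() stand
for your own cost and analytic gradient and are only assumed here:

#include <cstdio>
#include <vnl/vnl_vector.h>

double cost(vnl_vector<double> const& x);               // assumed: your cost
vnl_vector<double> grad(vnl_vector<double> const& x);   // assumed: your analytic gradient

// Take g_1 = grad(x_1), then plot grad(x) . g_1 / |g_1| as one parameter is
// varied, next to a finite-difference estimate of the same directional
// derivative; a wrong analytic gradient shows up as a mismatch.
void check_gradient(vnl_vector<double> const& x1)
{
  vnl_vector<double> g1 = grad(x1);
  double n1 = g1.magnitude();
  const double h = 1e-6;
  for (double d = -5; d <= 5; d += 0.1)
  {
    vnl_vector<double> x = x1;
    x[0] += d;                                           // perturb one parameter
    double analytic = dot_product(grad(x), g1) / n1;
    vnl_vector<double> xp = x + g1 * (h / n1);           // small step along g_1
    double fd = (cost(xp) - cost(x)) / h;                // directional derivative
    std::printf("%g %g %g\n", d, analytic, fd);
  }
}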

Finally (although it should have been first), try the amoeba first. The
amoeba is the most robust, hassle-free, reliable optimiser around. Once
it works, you can worry about using a faster optimiser (if speed turns
out to be a problem). By that time you will understand a lot more about
your optimisation problem, allowing you to make more intelligent choices.
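
If the amoeba here means vnl_amoeba, the wiring is roughly as below;
circle_rss() is just a placeholder for the scalar cost you compute from
the image:

#include <vnl/vnl_vector.h>
#include <vnl/vnl_cost_function.h>
#include <vnl/algo/vnl_amoeba.h>

double circle_rss(double cx, double cy, double r);   // assumed: your scalar cost

// The amoeba only needs a scalar cost, so wrap it as a vnl_cost_function.
struct circle_cost : public vnl_cost_function
{
  circle_cost() : vnl_cost_function(3) {}
  double f(vnl_vector<double> const& p)
  { return circle_rss(p[0], p[1], p[2]); }
};

// Usage:
// circle_cost c;
// vnl_amoeba amoeba(c);
// vnl_vector<double> p(3); p[0] = 157; p[1] = 157; p[2] = 100;
// amoeba.minimize(p);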

Ian.