Gradient Test 2

Maarten Krol

Gradient Test with observations

./scripts/run_gradtest.py -s 2000,1,1 -e 2000,1,3 rc/nam1x1-dummy_tr.rc

The output looks like

:::csh
           alpha               J_bg              J_obs              J_tot                DJ1                DJ2         DJ2/DJ1        1-DJ2/DJ1
               0  3.18200000000e+03  2.80841136525e+03  5.99041136525e+03  6.42102987469e+03  0.00000000000e+00               0                1
             0.1  2.57788340204e+03  2.80269885421e+03  5.38058225625e+03  6.42102987469e+03  6.09829109001e+03      0.94973722    0.05026277575
            0.01  3.11869887676e+03  2.80782492878e+03  5.92652380554e+03  6.42102987469e+03  6.38875597148e+03      0.99497372   0.005026281429
           0.001  3.17564099304e+03  2.80835256975e+03  5.98399356279e+03  6.42102987469e+03  6.41780245961e+03      0.99949737   0.000502631999
          0.0001  3.18136381036e+03  2.80840548419e+03  5.98976929454e+03  6.42102987469e+03  6.42070710822e+03      0.99994973  5.026708842e-05
           1e-05  3.18193637815e+03  2.80841077713e+03  5.99034715528e+03  6.42102987469e+03  6.42099757206e+03      0.99999497  5.030756232e-06
           1e-06  3.18199363779e+03  2.80841130644e+03  5.99040494423e+03  6.42102987469e+03  6.42102660095e+03      0.99999949  5.098478015e-07
           1e-07  3.18199936378e+03  2.80841135937e+03  5.99041072315e+03  6.42102987469e+03  6.42102932034e+03      0.99999991  8.633487425e-08
           1e-08  3.18199993638e+03  2.80841136467e+03  5.99041130104e+03  6.42102987469e+03  6.42102841084e+03      0.99999977  2.279779939e-07

Now what does the gradient test do? As the name suggests, it tests whether the gradient calculated by the adjoint TM5 model is correct. It does so by exploiting the following math:

Given a state x0, the TM5 model calculates the cost

Jtot(x0) = Jbg(x0) + Jobs(x0)

5.99041136525e+03 = 3.18200000000e+03 + 2.80841136525e+03

At this x0, the adjoint model calculates g0, the gradient of the cost function with respect to x0.

To check g0, the inner product of the gradient with itself (its squared norm) is calculated: DJ1 = &lt;g0,g0&gt;.

DJ1 = 6.42102987469e+03

The cost function is now evaluated for different values of α (0.1, 0.01, 0.001, etc.).

By Taylor expansion:

J(x0 - α·g0) = J(x0) - α·&lt;g0, ∂J(x0)/∂x0&gt; + O(α²)

Since ∂J(x0)/∂x0 = g0, this rearranges to

DJ2 = (J(x0) - J(x0 - α·g0))/α = &lt;g0,g0&gt; + O(α) = DJ1 + O(α)

Note that the state is perturbed along -g0, the descent direction; this is why J_tot in the table decreases as α grows.
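As a check, the α = 0.1 row of the table reproduces this definition (the cost decreases, so the perturbation is along -g0):

    DJ2 = (5.99041136525e+03 - 5.38058225625e+03) / 0.1 = 6.09829109001e+03

which matches the DJ2 column, and DJ2/DJ1 = 6.09829109001e+03 / 6.42102987469e+03 = 0.94973722, matching the DJ2/DJ1 column.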

So as α↓0, DJ2 should approach DJ1, and 1 - DJ2/DJ1 should go to zero.

The table shows that this is indeed the case. Moreover, the convergence is linear in α, as it should be: each tenfold decrease of α reduces 1 - DJ2/DJ1 by roughly a factor of ten, until finite-precision round-off takes over near α = 1e-08, where the error increases again.
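The mechanics of the test can be sketched in Python, with a toy quadratic cost standing in for the TM5 forward and adjoint runs (the function names and the toy cost are illustrative, not part of TM5):

```python
import numpy as np

def gradient_test(J, g0, x0, alphas):
    """Finite-difference check of a gradient g0 of cost function J at x0.

    DJ1 = <g0, g0>; DJ2 = (J(x0) - J(x0 - alpha*g0)) / alpha.
    As alpha -> 0, DJ2 -> DJ1, so 1 - DJ2/DJ1 -> 0 linearly in alpha.
    """
    dj1 = np.dot(g0, g0)
    results = []
    print(f"{'alpha':>10} {'DJ2/DJ1':>14} {'1-DJ2/DJ1':>16}")
    for alpha in alphas:
        # Perturb along -g0 (descent direction) and form the difference quotient.
        dj2 = (J(x0) - J(x0 - alpha * g0)) / alpha
        ratio = dj2 / dj1
        print(f"{alpha:>10.0e} {ratio:>14.8f} {1.0 - ratio:>16.6e}")
        results.append((alpha, ratio))
    return results

# Toy quadratic cost J(x) = 0.5 <x - xb, x - xb>, whose exact gradient at x0
# is x0 - xb; in TM5 the forward and adjoint model runs play these roles.
xb = np.zeros(3)
x0 = np.array([1.0, -2.0, 0.5])
J = lambda x: 0.5 * np.dot(x - xb, x - xb)
g0 = x0 - xb  # the "adjoint" gradient to be verified

res = gradient_test(J, g0, x0, [10.0**-k for k in range(1, 8)])
```

For this quadratic cost the ratio DJ2/DJ1 equals 1 - α/2 exactly, so the last column shrinks by a factor of ten per row, mirroring the table above.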

Thus we can conclude that the gradient calculated by the adjoint TM5 model is correct!


Related

Wiki: Home
Wiki: Observations

MongoDB Logo MongoDB