I would like to understand the math in the example; I have a question about it.
196 # Add collocation equations to the NLP
197 [fk] = f.call([T[k,j], X[k,j], U[k]])
198 g.append(h*fk - xp_jk)
Do you use a Gaussian integration formula with a transformation rule?
If I understand correctly, haven't you forgotten a factor of 0.5?
198 g.append(0.5*h*fk - xp_jk)
Ah! I see it, my mistake.
You use a different transformation rule: they transform into the interval [0, 1].
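For what it's worth, the factor 0.5 is exactly the Jacobian of mapping the standard Gauss-Legendre interval [-1, 1] onto a step of length h; if the collocation points are defined on [0, 1] instead, the Jacobian is h alone and no 0.5 appears. A minimal numerical sketch (plain NumPy, test function and values hypothetical):

```python
import numpy as np

# Integrate f(t) = t**2 over [t0, t0+h] with 3-point Gauss-Legendre
# quadrature, once with nodes on [-1, 1] (Jacobian h/2, the "factor
# 0.5") and once with the same nodes shifted to [0, 1] (Jacobian h).
f = lambda t: t**2
t0, h = 2.0, 0.5
exact = ((t0 + h)**3 - t0**3) / 3.0

# Standard nodes and weights on [-1, 1]
xi, wi = np.polynomial.legendre.leggauss(3)
I_sym = 0.5 * h * np.sum(wi * f(t0 + 0.5 * h * (xi + 1.0)))

# Same rule shifted to [0, 1]: tau = (xi+1)/2, weights halve as well
tau, w01 = 0.5 * (xi + 1.0), 0.5 * wi
I_unit = h * np.sum(w01 * f(t0 + h * tau))
```

Both `I_sym` and `I_unit` match the exact integral, which is why the example on [0, 1] is correct without the 0.5.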
Hello! The maths of the collocation is described in the user guide. For more detail, including how to handle algebraic constraints, check Larry Biegler's new book on nonlinear programming. Also note that there is a second, more advanced example, dae_collocation.py.
Good luck! Joel
Thank you Joel
You're the best!
In the example biegler_10_1.py
# 112 State at final time
113 ZF = SX("ZF")
ZF is unused.
Thanks for feedback. I've cleaned up the biegler_10_1.py example. It was using some "old" (but still valid) syntax. The new version should be more readable, I hope.
Thank you, that was fast!
My idea was a different one:
# Collocated states
Z = ssym("Z",N,K+1)
ZF = ssym("ZF",1)
# Construct the NLP
x = veccat([vec(Z.T),ZF])
## Print the time points
t_opt = (N*(K+1) + 1) * [0]
for i in range(N):
    for j in range(K+1):
        t_opt[j + (K+1)*i] = h*(i + tau_root[j])
t_opt[-1] = 1
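A self-contained version of the time-grid construction above, with hypothetical values for N, K, and tau_root (interval start plus approximate Legendre points on [0, 1]), might look like this:

```python
# Sketch of the collocation time grid on [0, 1], assuming example
# values for the number of intervals N, the order K, and tau_root.
N, K = 4, 2                       # intervals, collocation order
h = 1.0 / N                       # uniform step length
tau_root = [0.0, 0.2113, 0.7887]  # K+1 points per interval (example)

t_opt = (N*(K+1) + 1) * [0.0]
for i in range(N):
    for j in range(K+1):
        t_opt[j + (K+1)*i] = h*(i + tau_root[j])
t_opt[-1] = 1.0                   # final time point
```

The grid has N*(K+1)+1 entries: K+1 points per interval plus the single end point at t = 1.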
At the current solution, in my case, the last interval is not plotted.
g.append(Z[i+1,0] - rhs)
g.append(ZF - rhs)
OK, I guess you can extrapolate to get the solution at the end point. The example uses a Legendre basis polynomial, so the end point is not a variable in the NLP. The idea of the examples is not so much to provide "ready" OCP discretizations, but rather to show the idea so that people can modify the discretization themselves. If you find errors in the examples, feel free to post patches. Regards, Joel
g.append((Z[i+1,0] if i<N-1 else ZF) - rhs)
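Joel's extrapolation suggestion can be sketched numerically: with a Legendre basis the interval end point tau = 1 is not a collocation point, but the state there is recovered by evaluating the interpolating Lagrange polynomial at 1. A small sketch with hypothetical collocation points and a quadratic test state:

```python
import numpy as np

# Example collocation points on [0, 1] (approximate Legendre points
# plus the interval start); any distinct points work for this demo.
tau = np.array([0.0, 0.2113, 0.7887])

def lagrange_at(tau_pts, x_vals, t):
    """Evaluate the Lagrange interpolant through (tau_pts, x_vals) at t."""
    total = 0.0
    for j, tj in enumerate(tau_pts):
        lj = 1.0
        for r, tr in enumerate(tau_pts):
            if r != j:
                lj *= (t - tr) / (tj - tr)
        total += x_vals[j] * lj
    return total

# Three points reproduce a quadratic exactly, so extrapolating
# x(tau) = tau**2 + 1 to the interval end tau = 1 should give 2.
x_vals = tau**2 + 1.0
x_end = lagrange_at(tau, x_vals, 1.0)
```

This is the same evaluation the collocation polynomial already performs inside each interval, just applied at the end point instead of introducing ZF as an extra NLP variable.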
I hope everything is clear now.
Thank you again