In the IpoptSolver class I noticed the following input option for parametric NLPs:
NLP_P: Only for parametric NLPs - static parameters on which the objective and constraints might depend.
How do you define these parameters in the objective and constraint functions?
My actual problem is a bootstrap parameter-estimation problem: I have to solve the same PE problem thousands of times with only the data points changing. At the moment I create a new IpoptSolver object every time, which is not very efficient.
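For context, the resampling loop looks roughly like this. This is a minimal sketch with made-up data and a placeholder `solve_pe` helper standing in for the actual NLP solve; the point is only that the data change between solves while the problem structure stays fixed:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=200)  # original data points

def solve_pe(sample):
    # Placeholder for the parameter-estimation solve; here the
    # estimate is simply the sample mean.
    return sample.mean()

# Bootstrap: re-solve the same PE problem on resampled data.
estimates = []
for _ in range(1000):
    sample = rng.choice(data, size=data.size, replace=True)
    estimates.append(solve_pe(sample))

estimates = np.array(estimates)
print(estimates.std())  # bootstrap standard error of the estimate
```

Since only `sample` changes between iterations, rebuilding the solver each time wastes all the symbolic and derivative setup work, which is exactly what a parametric NLP interface would avoid.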
In the code below, 'unknown' is the variable being solved for and 'static' is treated as fixed:
unknown = casadi.ssym('unknown', 100)
static = casadi.ssym('static', 10)
# yadda yadda yadda
objective = casadi.SXFunction([unknown, static], [objective_function_expression])
constraint = casadi.SXFunction([unknown, static], [list_of_constraints])
solver = casadi.IpoptSolver(objective, constraint)
Thanks for the quick reply. I implemented it like this:
P = MX("P",len(tm),2*nx) # the parameters
V = MX("V",Xnp+Cnp) # the optimization variables
ffun = MXFunction([V,P],[f])
gfun = MXFunction([V,P],[g])
solver = IpoptSolver(ffun,gfun)
# Set options
but got the following error:
eval_h failed: on line 127 of file "/home/dominique/casadi/casadi/fx/sx_function_internal.cpp"
Cannot evaluate "function("unnamed_sx_function")" since variables [x_1_3,x_1_9,x_1_15,x_1_21,x_1_27,x_1_33,x_1_39,x_1_45,x_1_51,x_1_57,x_1_63,x_1_69,x_1_75,x_1_81,x_1_87,x_1_93,x_1_4,x_1_10,x_1_16,x_1_22,x_1_28,x_1_34,x_1_40,x_1_46,x_1_52,x_1_58,x_1_64,x_1_70,x_1_76,x_1_82,x_1_88,x_1_94,x_1_5,x_1_11,x_1_17,x_1_23,x_1_29,x_1_35,x_1_41,x_1_47,x_1_53,x_1_59,x_1_65,x_1_71,x_1_77,x_1_83,x_1_89,x_1_95] are free.
Am I doing something wrong?
I can't infer anything from the snippet. Perhaps a developer can.
If I use a BFGS Hessian approximation instead (by setting the corresponding solver option), it works perfectly. So I guess something is wrong in the way the exact Hessian is calculated?
Support for parametric NLPs is only experimental at the moment, so I am not surprised that it fails to form the Hessian function correctly, although this particular problem appears to be simple to fix. Later this month I will do some concentrated work on this part of the code. Whether "NLP_P" will be kept or not is still an open question.
My advice if you want to have parameters in your NLP is to declare them as normal NLP variables and then set both the upper and lower bound equal to your current parameter value.
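To illustrate the bound trick, here is a sketch using scipy.optimize rather than CasADi, purely to keep the example self-contained and runnable. The "parameter" p is modelled as an extra decision variable and pinned to its value by setting its lower and upper bounds equal:

```python
import numpy as np
from scipy.optimize import minimize

# Minimize (x - p)^2, where p is a "parameter" modelled as an extra
# decision variable, fixed by equal lower and upper bounds.
def objective(z):
    x, p = z
    return (x - p) ** 2

p_value = 3.0
bounds = [(None, None),        # x is free
          (p_value, p_value)]  # p is fixed: lb == ub

res = minimize(objective, x0=[0.0, p_value], bounds=bounds,
               method="L-BFGS-B")
print(res.x)  # x converges to p_value; p stays pinned
```

The same idea carries over to the IpoptSolver setup above: append the parameters to the variable vector, and before each solve set their entries in the lower- and upper-bound vectors to the current parameter values.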
I've added a ticket for this: https://sourceforge.net/apps/trac/casadi/ticket/451
Ok, thank you, it's working fine like that (i.e., with the parameters as extra variables).