[AMPL 24711] The gradient and Hessian of the Lagrangian function

Dear AMPL team,

Sorry to take your time. I am currently solving a nonconvex problem with equality and inequality constraints using the “Ipopt” solver. Ipopt reports “Optimal Solution Found”, which I take to mean that it has found a local minimizer of the problem. Now I want to evaluate the gradient and Hessian of the Lagrangian function of the original problem at that local minimizer. I coded the Lagrangian function according to the KKT conditions and used the MATLAB functions “gradient” and “hessian” to compute the values, but the gradient of the Lagrangian at the local minimizer is not zero, and the reduced Hessian matrix is not positive definite (so the second-order sufficient conditions appear to be violated). I am wondering whether I am using the wrong functions or misunderstanding the theorem. Could you please help me with this problem or give me some advice?

Thanks a lot!

By Wang

It is easy to make a mistake in translating the KKT conditions to MATLAB, and unfortunately there is no easy way to find any mistake except to review your work carefully several times. As a start, you should check that your KKT conditions include all bounds on the variables that are specified in AMPL “var” statements; for purposes of writing the conditions, the bounds must be treated as additional constraints. Also be sure that your model does not use any nonsmooth functions (for example, abs, min or max).
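For concreteness, here is a sketch of the stationarity condition that your MATLAB check should reproduce, written in generic notation rather than your model's names, with the bounds treated as additional constraints:

    \nabla_x \mathcal{L}(x^*) \;=\; \nabla f(x^*) \;+\; \sum_i \lambda_i^* \, \nabla c_i(x^*) \;-\; z_L^* \;+\; z_U^* \;=\; 0,
    \qquad \lambda_i^* \ge 0 \ \text{for the inequalities}, \quad z_L^* \ge 0, \quad z_U^* \ge 0,

where f is the objective, the c_i collect all equality and inequality constraints (written as c_i(x) = 0 or c_i(x) <= 0 for a minimization), and z_L, z_U are the multipliers on the lower and upper variable bounds, which are zero for bounds that are inactive at x*. If the bound multipliers are left out, the sum of the first two terms will in general not vanish at the solution whenever a bound is active there. Also be aware that sign conventions for the multipliers differ from one textbook (and solver) to another, so the signs in your MATLAB code must match the convention of whatever multiplier values you take from Ipopt.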

As an alternative, there is a package called “gjh” that asks AMPL to compute the gradient, Jacobian, and Hessian values at the current point. If that might be helpful for your work, write back and I can look to see where it might be available.
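For reference, a typical session would look roughly like the following. This is only a sketch: it assumes the gjh binary is installed and on your search path, and the model and data file names are placeholders for your own files.

    model mymodel.mod;      # placeholder name for your model file
    data  mymodel.dat;      # placeholder name for your data file
    option solver ipopt;
    solve;                  # obtain the local minimizer and its multipliers first

    option solver gjh;
    solve;                  # gjh evaluates at the current primal/dual point
    # gjh's solve message names a temporary file; "include" that file to load
    # the values as AMPL params (typically named g, J and H), which you can
    # then examine with "display".

If I recall correctly, the Hessian that gjh reports is the Hessian of the Lagrangian evaluated with the multiplier values left by the previous solve, which should be just the quantity you are trying to check.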