%% Algorithmic Differentiation in Optimization Toolbox for Matlab
% Show how the performance depends on whether AD is involved or not.
% Algorithmic differentiation (AD) has been available for problem-based
% formulations of optimization problems since R2020b. If the traditional
% solver-based formulation is used instead, the toolbox computes the
% gradient numerically by finite differences, unless it is explicitly
% provided.
%
% Based on https://blogs.mathworks.com/loren/2020/10/06/automatic-differentiation-in-optimization-toolbox/.

%% Problem-based formulation with AD engaged
x = optimvar('x');
y = optimvar('y');
fun = 100*(y - x^2)^2 + (1 - x)^2;    % Rosenbrock objective
unitdisk = x^2 + y^2 <= 1;            % constrain to the unit disk
prob = optimproblem("Objective",fun,"Constraints",unitdisk);
x0.x = 0;
x0.y = 0;
[sol,fval,exitflag,output] = solve(prob,x0)

%% Plot the objective (log scale) over the unit disk, marking the solution
[R,TH] = ndgrid(linspace(0,1,100),linspace(0,2*pi,200));
[X,Y] = pol2cart(TH,R);
surf(X,Y,log(1 + 100*(Y - X.^2).^2 + (1 - X).^2),'EdgeColor','none')
colorbar
view(0,90)
axis equal
hold on
plot3(sol.x,sol.y,1,'ro','MarkerSize',10)
hold off

%% Inspect the automatically generated objective function
problem = prob2struct(prob);
problem.objective

%% Solver-based formulation with FD engaged
fun = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
nonlcon = @circlecon;
[x,fval,exitflag,output] = fmincon(fun,[0;0],[],[],[],[],[],[],nonlcon)

function [c,ceq] = circlecon(x)
c = x(1)^2 + x(2)^2 - 1;  % nonlinear inequality: stay inside the unit disk
ceq = [];                 % no nonlinear equalities
end
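Since the point of the comparison is how performance depends on whether AD is engaged, a useful variant is to re-solve the problem-based formulation with AD switched off. The following is a sketch assuming R2020b or later, where `solve` accepts the `'ObjectiveDerivative'` and `'ConstraintDerivative'` name-value arguments; timing both calls with `tic`/`toc` exposes the AD-versus-FD difference directly.

```matlab
% Sketch: the same problem-based model solved twice, once with AD
% (the default) and once forcing finite differences, for a timing
% comparison. Assumes prob and x0 are defined as above.
tic
[solAD,fvalAD] = solve(prob,x0);                       % AD gradients
tAD = toc;

tic
[solFD,fvalFD] = solve(prob,x0, ...
    'ObjectiveDerivative','finite-differences', ...
    'ConstraintDerivative','finite-differences');      % FD gradients
tFD = toc;

fprintf('AD: %.4f s, FD: %.4f s\n',tAD,tFD)
```

For a problem this small the absolute times are tiny, so wrapping each `solve` call in `timeit` or repeating it in a loop gives more stable numbers.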
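The solver-based run above lets `fmincon` estimate the gradient by finite differences. As the header comment notes, this can be avoided by providing the gradient explicitly. A minimal sketch, using `fmincon`'s documented `'SpecifyObjectiveGradient'` option and a hand-derived Rosenbrock gradient (the helper name `rosenbrockwithgrad` is our own):

```matlab
% Sketch: solver-based formulation with an analytic gradient supplied,
% so fmincon skips finite differencing of the objective.
opts = optimoptions('fmincon','SpecifyObjectiveGradient',true);
[x,fval,exitflag,output] = fmincon(@rosenbrockwithgrad, ...
    [0;0],[],[],[],[],[],[],@circlecon,opts)

function [f,g] = rosenbrockwithgrad(x)
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
if nargout > 1                       % gradient requested by the solver
    g = [-400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
end

function [c,ceq] = circlecon(x)
c = x(1)^2 + x(2)^2 - 1;  % nonlinear inequality: unit disk
ceq = [];                 % no nonlinear equalities
end
```

This is effectively what the problem-based workflow automates: `prob2struct` generates an objective function whose gradient was produced by AD instead of by hand.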