https://kr.mathworks.com/videos/optimization-in-matlab-82391.html
https://kr.mathworks.com/help/gads/optimization-workflow.html
To solve an optimization problem:
Decide what type of problem you have, and whether you want a local or global solution (see Local vs. Global Optima). Choose a solver per the recommendations in Table for Choosing a Solver.
Write your objective function and, if applicable, constraint functions per the syntax in Compute Objective Functions and Write Constraints.
Set appropriate options using optimoptions, or prepare a GlobalSearch or MultiStart problem as described in Workflow for GlobalSearch and MultiStart. For details, see Pattern Search Options, Particle Swarm Options, Genetic Algorithm Options, Simulated Annealing Options, or Surrogate Optimization Options.
Run the solver.
Examine the result. For information on the result, see Solver Outputs and Iterative Display or Examine Results for GlobalSearch or MultiStart.
If the result is unsatisfactory, change the options or start points, or otherwise update your optimization, and rerun it. For information, see Global Optimization Toolbox Solver Characteristics or Improve Results. For information on improving solutions that applies mainly to smooth problems, see When the Solver Fails, When the Solver Might Have Succeeded, or When the Solver Succeeds. A minimal end-to-end sketch of this workflow follows.
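The following sketch walks through the workflow with patternsearch; the objective function, start point, and bounds are illustrative assumptions, not part of the documented problem:

% Steps 1-2: choose a solver and write an objective function.
fun = @(x) x(1)^2 + x(2)^2 - 2*cos(3*x(1)) - 2*cos(3*x(2));
% Step 3: set options using optimoptions.
options = optimoptions('patternsearch','Display','iter');
% Step 4: run the solver from a start point, with bounds.
x0 = [2 3];
lb = [-5 -5];
ub = [5 5];
[x,fval,exitflag,output] = patternsearch(fun,x0,[],[],[],[],lb,ub,[],options);
% Step 5: examine the result. exitflag explains why the solver stopped;
% output.funccount gives the number of function evaluations used.
disp(exitflag)
disp(output.funccount)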
Optimization is the process of finding the point that minimizes a function. More specifically:
A local minimum of a function is a point where the function value is smaller than or equal to the value at nearby points, but possibly greater than at a distant point.
A global minimum is a point where the function value is smaller than or equal to the value at all other feasible points.
Generally, Optimization Toolbox™ solvers find a local optimum. (This local optimum can be a global optimum.) They find the optimum in the basin of attraction of the starting point. For more information, see Basins of Attraction.
In contrast, Global Optimization Toolbox solvers are designed to search through more than one basin of attraction. They search in various ways:
GlobalSearch and MultiStart generate a number of starting points. They then use a local solver to find the optima in the basins of attraction of the starting points (see the sketch after this list).
ga uses a set of starting points (called the population) and iteratively generates better points from the population. As long as the initial population covers several basins, ga can examine several basins.
particleswarm, like ga, uses a set of starting points. particleswarm can examine several basins at once because of its diverse population.
simulannealbnd performs a random search. Generally, simulannealbnd accepts a point if it is better than the previous point. simulannealbnd occasionally accepts a worse point, in order to reach a different basin.
patternsearch looks at a number of neighboring points before accepting one of them. If some neighboring points belong to different basins, patternsearch in essence looks in a number of basins at once.
surrogateopt begins by quasirandom sampling within bounds, looking for a small objective function value. surrogateopt uses a merit function that, in part, gives preference to points that are far from already evaluated points, as an attempt to reach a global solution. When it can no longer improve the current point, surrogateopt resets, causing it to sample widely within bounds again. Resetting is another way surrogateopt searches for a global solution.
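As a concrete illustration of the first search style, here is a minimal MultiStart sketch; rastriginsfcn ships with Global Optimization Toolbox, while the start point, bounds, and number of start points are illustrative assumptions:

% MultiStart generates start points, then runs a local solver (fmincon here)
% from each; start points in distinct basins yield distinct local minima.
problem = createOptimProblem('fmincon', ...
    'objective',@rastriginsfcn, ...
    'x0',[20 30], ...
    'lb',[-50 -50],'ub',[50 50]);
ms = MultiStart;
[xbest,fbest,exitflag,output,allmins] = run(ms,problem,50); % 50 start points
% allmins lists the distinct local minima found, one per basin reached.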
If an objective function f(x) is smooth, the vector –∇f(x) points in the direction where f(x) decreases most quickly. The equation of steepest descent, namely

dx(t)/dt = –∇f(x(t)),

yields a path x(t) that goes to a local minimum as t gets large. Generally, initial values x(0) that are close to each other give steepest descent paths that tend to the same minimum point. The basin of attraction for steepest descent is the set of initial values that lead to the same local minimum.
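A small illustration of basins of attraction, using an assumed one-dimensional objective that is not from the original text; two nearby starts reach the same minimum, while a start in another basin reaches a different one:

% f has several local minima because the sine term oscillates.
f = @(x) x^2 + 5*sin(3*x);
opts = optimoptions('fminunc','Display','none');
xa = fminunc(f,-1.5,opts);  % start in one basin
xb = fminunc(f,-1.3,opts);  % nearby start, same basin, same minimum
xc = fminunc(f, 1.5,opts);  % start in a different basin, different minimum
fprintf('xa = %g, xb = %g, xc = %g\n',xa,xb,xc)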
The following figure shows two one-dimensional minima. The figure shows different basins of attraction with different line styles, and it shows directions of steepest descent with arrows. For this and subsequent figures, black dots represent local minima. Every steepest descent path, starting at a point x(0), goes to the black dot in the basin containing x(0).
The following figure shows how steepest descent paths can be more complicated in more dimensions.
The following figure shows even more complicated paths and basins of attraction.
Constraints can break up one basin of attraction into several pieces. For example, consider minimizing y subject to:
y ≥ |x|
y ≥ 5 – 4(x–2)².
The figure shows the two basins of attraction with the final points.
The steepest descent paths are straight lines down to the constraint boundaries. From the constraint boundaries, the steepest descent paths travel down along the boundaries. The final point is either (0,0) or (11/4,11/4), depending on whether the initial x-value is above or below 2.
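A sketch of this constrained problem with fmincon; the start points are illustrative, fmincon's path is not literal steepest descent, and |x| is nonsmooth at the origin, so the solver may only approximate (0,0):

% Minimize y subject to y >= |x| and y >= 5 - 4*(x-2)^2, with z = [x y].
fun = @(z) z(2);                                    % objective: y
nonlcon = @(z) deal([abs(z(1)) - z(2); ...          % |x| - y <= 0
                     5 - 4*(z(1)-2)^2 - z(2)], []); % parabola constraint <= 0
opts = optimoptions('fmincon','Display','none');
zA = fmincon(fun,[1 10],[],[],[],[],[],[],nonlcon,opts) % x0 below 2: near (0,0)
zB = fmincon(fun,[3 10],[],[],[],[],[],[],nonlcon,opts) % x0 above 2: near (11/4,11/4)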
Create optimization variables, build a problem with an objective function and constraints, and call solve; a minimal sketch follows.
https://kr.mathworks.com/help/gads/problem-based-setup.html
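A minimal sketch of that problem-based workflow; the variables, objective, and constraint are illustrative:

% Create optimization variables with bounds.
x = optimvar('x','LowerBound',-5,'UpperBound',5);
y = optimvar('y','LowerBound',-5,'UpperBound',5);
% Build a problem with an objective and a constraint, then call solve.
prob = optimproblem('Objective',(x-1)^2 + (y+2)^2);
prob.Constraints.circle = x^2 + y^2 <= 4;  % illustrative constraint
sol = solve(prob)                          % solve selects a suitable solver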
Set Problem-Based Optimization Options for Global Optimization Toolbox Solvers
How to set and change optimization options in the problem-based approach for Global Optimization Toolbox.
Set Options in Problem-Based Approach Using varindex
To set options in some contexts, map the problem-based variables to solver-based indices by using varindex.
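A minimal varindex sketch, using illustrative variables:

% varindex maps each problem-based variable to its position in the
% solver-based decision vector, which solver-level options refer to.
x = optimvar('x',2);
y = optimvar('y');
prob = optimproblem('Objective',sum(x.^2) + y^2);
idx = varindex(prob); % structure with one field of indices per variable
idx.x                 % indices of x in the solver-based vector, e.g. [1 2]
idx.y                 % index of y, e.g. 3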
Pattern Search Options
Explore the options for pattern search.
Genetic Algorithm Options
Explore the options for the genetic algorithm.
Particle Swarm Options
Explore the options for particle swarm.
Surrogate Optimization Options
Explore the options for surrogate optimization, including algorithm control, stopping criteria, command-line display, and output and plot functions.
Simulated Annealing Options
Explore the options for simulated annealing.
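For instance, a sketch of setting a few genetic algorithm options with optimoptions; the option values are illustrative, and rastriginsfcn ships with Global Optimization Toolbox:

% Set selected ga options; unspecified options keep their defaults.
opts = optimoptions('ga', ...
    'PopulationSize',100, ...   % individuals per generation
    'MaxGenerations',200, ...   % stopping criterion
    'PlotFcn','gaplotbestf');   % plot the best fitness each generation
rng default                     % ga is stochastic; fix the seed to reproduce
[x,fval] = ga(@rastriginsfcn,2,[],[],[],[],[],[],[],opts);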
Choose a solver, define objective functions and constraints, and compute in parallel.
https://kr.mathworks.com/help/gads/optimization-problem-setup.html
Choose the most appropriate solver and algorithm.
ga Find minimum of function using genetic algorithm
gamultiobj Find Pareto front of multiple fitness functions using genetic algorithm
paretosearch Find points in Pareto set (see the sketch after this list)
particleswarm Particle swarm optimization
patternsearch Find minimum of function using pattern search
simulannealbnd Find minimum of function using simulated annealing algorithm
surrogateopt Surrogate optimization for global minimization of time-consuming objective functions
GlobalSearch Find global minimum
MultiStart Find multiple local minima
Optimize or solve equations in the Optimize Live Editor task (since R2020b).
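As an illustration of the multiobjective entries above, a minimal paretosearch sketch on two assumed, conflicting objectives:

% Each row of the output is a Pareto-optimal point for the two objectives.
fun = @(x) [norm(x - [1 1]), norm(x + [1 1])]; % illustrative fitness functions
lb = [-2 -2]; ub = [2 2];
xpareto = paretosearch(fun,2,[],[],[],[],lb,ub); % points on the Pareto front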
https://kr.mathworks.com/help/gads/example-comparing-several-solvers.html
This example shows how to minimize Rastrigin's function with six solvers. Each solver has its own characteristics, which lead to different solutions and run times. The results, examined in Compare Syntax and Solutions below, can help you choose an appropriate solver for your own problems. The six solvers are listed next, followed by a simplified sketch of their calling syntax.
fminunc
patternsearch
ga
particleswarm
surrogateopt
GlobalSearch
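Simplified versions of the six calls; the documented example additionally sets options, seeds, and initial populations near [20,30], so these lines are a sketch rather than the exact example code:

rng default                      % several of these solvers are stochastic
x0 = [20 30];
lb = [-70 -70]; ub = [130 130];  % illustrative finite bounds for surrogateopt
[xf,ff]   = fminunc(@rastriginsfcn,x0);
[xp,fp]   = patternsearch(@rastriginsfcn,x0);
[xga,fga] = ga(@rastriginsfcn,2);
[xpw,fpw] = particleswarm(@rastriginsfcn,2);
[xs,fs]   = surrogateopt(@rastriginsfcn,lb,ub);
gs = GlobalSearch;
problem = createOptimProblem('fmincon','objective',@rastriginsfcn, ...
    'x0',x0,'lb',lb,'ub',ub);
[xgs,fgs] = run(gs,problem);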
Compare Syntax and Solutions
Solver           Solution (x(1), x(2))         Objective     # Fevals
______________   ___________________________   ___________   ________
fminunc            19.899        29.849         12.934             15
patternsearch      19.899       -9.9496         4.9748            174
ga             -0.0042178    -0.0024347         4.7054e-05       9453
particleswarm      9.9496      6.75e-07         0.99496          1140
surrogateopt      -1.3383      -0.30217         3.5305            200
GlobalSearch   -1.4046e-08   -1.4046e-08        0                2157
These results are typical:
fminunc quickly reaches the local solution within its starting basin, but does not explore outside this basin at all. fminunc has a simple calling syntax.
patternsearch takes more function evaluations than fminunc, and searches through several basins, arriving at a better solution than fminunc. The patternsearch calling syntax is the same as that of fminunc.
ga takes many more function evaluations than patternsearch. By chance it arrives at a better solution. In this case, ga finds a point near the global optimum. ga is stochastic, so its results change with every run. ga has a simple calling syntax, but there are extra steps to have an initial population near [20,30].
particleswarm takes fewer function evaluations than ga, but more than patternsearch. In this case, particleswarm finds a point with lower objective function value than patternsearch, but higher than ga. Because particleswarm is stochastic, its results change with every run. particleswarm has a simple calling syntax, but there are extra steps to have an initial population near [20,30].
surrogateopt stops when it reaches a function evaluation limit, which by default is 200 for a two-variable problem. surrogateopt has a simple calling syntax, but requires finite bounds. surrogateopt attempts to find a global solution, but in this case does not succeed. Each function evaluation in surrogateopt takes a longer time than in most other solvers, because surrogateopt performs many auxiliary computations as part of its algorithm.
The GlobalSearch run takes a number of function evaluations of the same order of magnitude as ga and particleswarm, searches many basins, and arrives at a good solution. In this case, GlobalSearch finds the global optimum. Setting up GlobalSearch is more involved than setting up the other solvers. As the example shows, before calling GlobalSearch, you must create both a GlobalSearch object (gs in the example) and a problem structure (problem). Then you call the run method with gs and problem. For more details on how to run GlobalSearch, see Workflow for GlobalSearch and MultiStart.
To create the plot function for this example, copy and paste the following code into a new function file in the MATLAB® Editor: