Optimization in Practice with MATLAB for Engineering Students and Professionals (2015)

PART 3


USING OPTIMIZATION—PRACTICAL ESSENTIALS

8


Global Optimization Basics

8.1 Overview

The objective of global optimization is to find the best solution among the known or discoverable set of local optima; this best local optimum is called the global optimum. Formally, global optimization seeks a global solution to an unconstrained or constrained optimization problem. Global search techniques are essential in many applications (e.g., advanced engineering design, data analysis, financial planning, process control, risk management, and scientific modeling), because highly nonlinear engineering models often entail multiple local optima.

A global optimization problem can be defined as

min_x  f(x)  (8.1)

subject to

g(x) ≤ 0  (8.2)

h(x) = 0  (8.3)

x_l ≤ x ≤ x_u  (8.4)

The function f(x) represents the objective function. The constraints include the inequality constraint function g(x), the equality constraint function h(x), and the side constraints x_l ≤ x ≤ x_u on the design variables.

The unique challenges presented by global optimization are discussed in Sec. 8.2. Three typical methods employed to solve global optimization problems are presented in Sections 8.3, 8.4, and 8.5. Section 8.6 illustrates how to solve global optimization problems using the MATLAB Global Optimization Toolbox. Helpful pertinent references for a more advanced treatment include (Refs. [1, 2]).

8.2 Practical Issues in Global Optimization

As explained in Chapter 2, an objective function can have both a global minimum and multiple local minima. When this occurs, the problem is classified as a multimodal optimization problem. The objective of global optimization is to find the global optimum.

Figure 8.1 presents two different types of one-dimensional objective functions. The variable x lies in the interval between 0 and 10. The function illustrated in Fig. 8.1(a) is unimodal: it has only one local minimum, which is also its global minimum. The function shown in Fig. 8.1(b) is multimodal. The four points x1*, x2*, x3*, and x* are all local minima. The function value at x* is the lowest of all the local minima; it is the global minimum of the multimodal objective function.

Figure 8.1. Unimodal and Multimodal Objective Functions

The global minimum of the unimodal objective function in Fig. 8.1(a) can be found using the gradient-based optimization methods presented in Chapters 12 and 13. In the case of the multimodal objective function presented in Fig. 8.1(b), the global minimum is not readily found: gradient-based optimization methods will simply yield one of the local minima, depending on the starting point used. Finding the global optimum is not guaranteed when using gradient-based algorithms.

Figure 8.2 provides the 3D surface mesh of a two-dimensional multimodal function. Note that this function has several local minima and one global minimum.

Figure 8.2. The 3D Surface Plot of a Multimodal Function
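The function behind Fig. 8.2 is not given in the text. For readers who wish to visualize a comparable multimodal surface, MATLAB's built-in peaks function (also used in Problem 8.5) is a convenient stand-in with several local minima and one global minimum:

% Plot a stand-in multimodal surface similar in spirit to Fig. 8.2
[X, Y] = meshgrid(-3:0.1:3);
Z = peaks(X, Y);
mesh(X, Y, Z)
xlabel('x_1'); ylabel('x_2'); zlabel('f(x)')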

In practical global optimization problems, the situation can be more complicated and challenging than the above examples. Objective functions may be non-smooth or discontinuous. At some points or in certain intervals, the derivatives of objective functions may not be available or may be expensive to compute. For some real life engineering design problems, it is often challenging to determine whether a highly nonlinear function is unimodal or multimodal before starting the optimization. More importantly, even if a problem is known to be multimodal, it is generally not possible to know how many modes/optima there are. Gradient-based optimization methods, in their conventional forms, may not be appropriate for these global optimization problems. Derivative-free methods, such as evolutionary algorithms, may be more appropriate in these cases.

8.3 Exhaustive Search

Exhaustive search algorithms enumerate all of the possible candidate solutions to an optimization problem; the best candidate is the global optimum. Exhaustive search is applicable to optimization problems that involve a manageable number of variable combinations. The time required to conduct an exhaustive search increases dramatically as the number of candidate solutions increases. The following example applies the exhaustive search process to a global optimization problem.

Example: Use exhaustive search to find the global optimum of the following optimization problem.

(8.5)

subject to

(8.6)

(8.7)

Each of the design variables has three possible values. In selecting x1 and x2 from the feasible sets, there are nine possible combinations. The nine combinations and their objective function values are listed in Table 8.1.

Table 8.1. Design Variable Combinations and Their Objective Function Values

Comparing the nine objective function values in Table 8.1, we find that the global minimum of f(x) is 5. The optimal solution is x1 = 3 and x2 = 2.
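The enumeration itself is easily mechanized. The sketch below is illustrative only: since Eqs. (8.5)-(8.7) define the actual problem, the objective function and feasible sets here are hypothetical stand-ins, chosen so that the output reproduces the reported optimum of f(x) = 5 at x1 = 3 and x2 = 2.

% Exhaustive search over all combinations of the two feasible sets
X1 = [3 4 5];        % hypothetical feasible set for x1
X2 = [2 3 4];        % hypothetical feasible set for x2
fbest = inf; xbest = [];
for x1 = X1
for x2 = X2
f = x1 + x2;         % hypothetical objective function
if f < fbest         % keep the best candidate found so far
fbest = f; xbest = [x1 x2];
end
end
end
fprintf('Global minimum f = %g at x1 = %g, x2 = %g\n', ...
fbest, xbest(1), xbest(2))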

8.4 Multiple Start

The multiple start approach uses gradient-based methods to find local and global minima. It generates multiple starting points, and stores local and global solutions found during the search process.

Several variations of the multiple start method are available in the literature. The basic multiple start method consists of the following steps:

1. Generate multiple starting points. The points can be either uniformly distributed within predefined bounds or generated using a sampling algorithm.

2. Filter out the infeasible starting points. This step is optional. If applied, this step may help reduce the total computational expense required to find the corresponding feasible optima.

3. For each starting point, use a gradient-based optimization method to search for a local minimum.

4. Save the multiple local minima returned by Step 3.

5. Compare all the local minima. The local minimum with the lowest objective function value is considered the global minimum. However, since there is no guarantee that we have obtained the complete set of local minima, we cannot be certain that we have obtained the global minimum.

The function shown in Fig. 8.1(b) is a multimodal objective function. The multiple start method can be used to find its global optimum.

Example: The optimization problem shown in Fig. 8.1(b) is stated as follows. Use the multiple start method to find its global optimum.

min  f(x) = x + 10 sin(2x)  (8.8)

subject to

0 ≤ x ≤ 10  (8.9)

Choose 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, and 10 as starting points. These eleven points are uniformly distributed within the specified bounds. Note that the practical implementation of the multiple start method can demand a significantly higher number of starting points. In this example, fmincon is used to solve the optimization problem corresponding to each starting point. The MATLAB codes, including the main file, the objective function file, and the constraint function file, are given below.

1. Main file

clear
clc

% Define linear constraints (none for this problem)
A=[]; B=[]; Aeq=[]; Beq=[];

% Define bound constraints
LB=[0]; UB=[10];

% Define optimization options
options=optimset('LargeScale','off','MaxFunEvals', ...
200000,'Display','off','MaxIter',1e6);

% Define starting points
xsta = [0 1 2 3 4 5 6 7 8 9 10];
k = length(xsta);

% Optimization from the 11 starting points;
% store the local solution found from each start
for i=1:1:k
x0=xsta(i)
[xopt(i),fopt(i)]=fmincon(@MS_Func, x0, A, B, ...
Aeq, Beq, LB, UB, @MS_cons, options)
end

2. Objective function file

function [f]=MS_Func(x)

% Objective function: f(x) = x + 10*sin(2*x)
f = x+10*sin(2*x);

3. Constraint function file

function [C, Ceq]=MS_cons(x)

% No nonlinear inequality or equality constraints for this problem
C = [];
Ceq = [];

The optimization yields 11 local optima, some of which are redundant. They are listed in Table 8.2. In the table, x0 represents the starting points, x* represents the corresponding local optimal values of the variable, and f*(x) represents the corresponding local optima of f(x).

Table 8.2. Local Optima for 11 Starting Points

By comparing the local optima given by the multiple start method (as reported in Table 8.2), the global minimum is found to be -7.66. The corresponding optimal value of x is 2.33. It is important to note that, in a multiple start approach, the individual optimization runs do not depend on each other, so the runs can be executed in parallel, as sketched below. This strategy can significantly reduce the overall wall-clock computing time by taking advantage of the parallel and distributed computing capabilities widely available today. MATLAB itself provides pertinent capabilities in the form of its “Parallel Computing Toolbox.”
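A minimal sketch of this parallel strategy is given below. It assumes the variables defined in the main file above (xsta, k, A, B, Aeq, Beq, LB, UB, and options) and an open parallel pool from the Parallel Computing Toolbox; it is an illustration, not the toolbox's only usage pattern.

% Run the independent fmincon searches in parallel
xopt = zeros(1,k); fopt = zeros(1,k);
parfor i = 1:k
[xopt(i),fopt(i)] = fmincon(@MS_Func, xsta(i), A, B, ...
Aeq, Beq, LB, UB, @MS_cons, options);
end

% Take the best local minimum found as the global minimum
[fbest, ibest] = min(fopt);
xbest = xopt(ibest)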

8.5 Role of Genetic Algorithms in Global Optimization

Evolutionary algorithms are population-based optimization algorithms inspired by the principles of natural evolution. When compared with other optimization algorithms, the advantage of evolutionary algorithms is that they do not make limiting assumptions about the underlying objective functions. The objective functions are treated as black-box functions. Furthermore, the definition of objective functions does not require significant insight into the structure of the design space.

Darwin’s theory of evolution identified the principles of natural selection and survival of the fittest as driving forces behind biological evolution. His theory can be summarized as follows [3].

Variation

There is variation between individuals in a population.

Competition

Resources are limited. In such an environment, there will be a struggle for survival among individuals.

Offspring

Species have great fertility. They produce more offspring than can grow to adulthood.

Genetics

Organisms pass genetic traits to their offspring.

Natural selection

Those individuals with the most beneficial traits are more likely to survive and reproduce.

Evolutionary algorithms are based on the principles of biological evolution described in Darwin's theory. Genetic Algorithms (GAs) are a class of evolutionary algorithms, also known as population-based metaheuristic optimization algorithms. The GA was first conceived by J. H. Holland [4]. A GA uses a population of solutions, whose individuals are represented in the form of chromosomes. The individuals in the population go through a process of simulated evolution to obtain the global optimum.

The GA repeatedly modifies a set of solutions or individuals over the course of its entire run. At each iteration, the genetic algorithm selects individuals from the current population to serve as parents based on certain criteria. The parents are then used to create the next generation of individuals, called children. Over successive generations, the population evolves toward an optimal solution or a Pareto frontier, depending on the type of problem and the type of GA being used. The procedure for a genetic algorithm is illustrated in Fig. 8.3.

Figure 8.3. Procedure for a Genetic Algorithm

The terms used in the procedure in Fig. 8.3 are explained below:

Encoding

Encoding is a way to represent individual solutions in evolutionary algorithms. Typically, individual solutions are coded as a finite fixed-length string. Binary numbers are usually used as codes. This string is also known in the literature as a chromosome. For example, the binary number 10001 represents the decimal number 17. The conversion from the binary number to the decimal number is given by 1 × 2^4 + 0 × 2^3 + 0 × 2^2 + 0 × 2^1 + 1 × 2^0 = 16 + 1 = 17.
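This conversion can be checked directly in MATLAB using the built-in bin2dec function:

bin2dec('10001')   % returns 17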

Initial population

The algorithm begins by generating a population of individuals in the design space. Prior to population initialization, the designer must choose the number of individuals in each population and the number of bits in the encoding process. Both of these decisions are extremely important to the success of the GA-based optimization. For example, too large a population can lead to undesirable computational expense, while too small a population can lead to premature convergence to a local optimum or suboptimal point. Further description of these features and issues of GAs can be found in the foundational book on GAs by Goldberg [5].

Evaluation

Computation of the objective values for the individual solutions.

Optimization criteria

The stopping criteria of the algorithm. Examples of stopping criteria include the number of generations, the computation time limit, and the function tolerance.

Fitness assignment

There are several choices of fitness assignment. In a rank-based fitness assignment, the individuals are sorted according to their objective values. It creates an order among the individuals.
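As a minimal sketch (the objective values here are hypothetical), rank-based fitness assignment for a minimization problem can be written as:

% Hypothetical objective values of four individuals
fvals = [3.2 -1.5 0.7 2.1];

% Sort so that the best (lowest) objective value is ranked first
[~, order] = sort(fvals, 'ascend');

% Assign fitness by rank: the best-ranked individual gets the highest fitness
n = numel(fvals);
fitness = zeros(1, n);
fitness(order) = n:-1:1;   % yields fitness = [1 4 3 2]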

Selection

A selection criterion filters out the candidate solutions with poor fitness and retains those with acceptable fitness to enter the reproduction process with a higher probability.
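One common way to realize such probabilistic selection is fitness-proportionate (roulette-wheel) selection, sketched below using the hypothetical fitness vector from the previous sketch; many other selection schemes exist.

% Selection probabilities proportional to fitness
fitness = [1 4 3 2];
probs = fitness / sum(fitness);

% Spin the wheel: select the first individual whose cumulative
% probability covers a uniform random number
cum = cumsum(probs);
pick = find(rand <= cum, 1, 'first')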

Reproduction

A new generation in the genetic algorithm is created through reproduction from the previous generation. Three mechanisms (elitist, crossover, and mutation) are primarily used to create a new generation.

Elitist

The individuals with the best fitness values in the current generation are guaranteed to survive into the next generation.

Crossover

In this technique, a part of the encoded string of one individual is exchanged with the corresponding string part of another individual. There are many approaches to performing the crossover operation. Suppose there are two individuals, 10101 and 11001. Exchanging the first two bits of the two individuals yields the offspring 11101 and 10001.
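This crossover example can be sketched in MATLAB with the chromosomes stored as character strings:

% Two parent chromosomes
p1 = '10101'; p2 = '11001';

% Exchange the first two bits to create two children
cut = 2;
c1 = [p2(1:cut) p1(cut+1:end)]   % '11101'
c2 = [p1(1:cut) p2(cut+1:end)]   % '10001'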

Mutation

A mutated child solution is generated from a single parent by randomly reversing some bits from 0 to 1, or vice versa. For example, suppose that through mutation, the 2nd bit and the 5th bit of 10101 are reversed. The new offspring is 11100.
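The mutation example admits a similar string-based sketch:

% Parent chromosome and the bit positions to reverse
p = '10101';
flip = [2 5];

% Flip each selected bit: '0' becomes '1' and vice versa
c = p;
for j = flip
if c(j) == '0'
c(j) = '1';
else
c(j) = '0';
end
end
% c is now '11100'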

Best individual

The best solution found when the stopping criteria are satisfied; it is returned as the global optimum.

The steps of GA are illustrated in more detail in Chapter 19. The multimodal function shown in Fig. 8.1(b) is solved using MATLAB GA Solver as follows.

Example: The optimization problem presented in Fig. 8.1(b) is stated as follows.

min  f(x) = x + 10 sin(2x)  (8.10)

subject to

0 ≤ x ≤ 10  (8.11)

To set up this problem in MATLAB using the ga command, three M-files are generated: a main file, an objective function file, and a constraint function file. Note that this file structure is similar to the one used with fmincon.

The main file contains the initializations, bounds, options, and the ga command. The objective function file contains the objective or the fitness function definition. The constraint file contains the nonlinear inequality and equality constraints. The files are reported below.

1. Main file

clear
clc

% Define linear constraints (none for this problem)
A=[]; B=[]; Aeq=[]; Beq=[];

% Define bound constraints
LB=[0]; UB=[10];

% Number of design variables
nvars = 1;

% Optimization function ga
[x,fval] = ga(@GA_Func,nvars,A,B,Aeq,Beq,LB,UB,@GA_cons);

display(x)
display(fval)

2. Objective function file

function [f]=GA_Func(x)

f = x+10*sin(2*x);

3. Constraint function file

function [C, Ceq]=GA_cons(x)

% Define inequality constraints
C = [];

% Define equality constraints
Ceq = [];

The global optimum of the objective function obtained by MATLAB is -7.6563. The optimum value of the variable is 2.3312. The result is the same as that obtained using the multiple start method.

This problem can also be solved through the graphical user interface of the GA Solver. Please note that the folder containing the fitness function and the nonlinear constraint function should be selected as the MATLAB Current Directory. Figure 8.4 shows how to set up the GA solver to solve the problem. This screen can be opened by typing optimtool('ga') in the Command Window. Alternatively, you can type optimtool in the Command Window, then choose the ga solver option from the top left dropdown menu. Set @GA_Func as the fitness function. Set @GA_cons as the nonlinear constraint function; since there are no nonlinear constraints in this problem, this field can also be left blank. Set the number of variables as 1. Set the lower bound as 0 and the upper bound as 10.

After 5 iterations, the optimization is terminated. The global optimum value of the objective function is -7.6563.

Figure 8.4. MATLAB Genetic Algorithm Solver from the Global Optimization Toolbox

8.6 MATLAB Global Optimization Toolbox

The MATLAB Global Optimization Toolbox provides methods to search for global solutions to problems that contain multiple maxima or minima. The optimization solvers in the toolbox include global search, multiple start, pattern search, genetic algorithms, and simulated annealing solvers. Using the toolbox, one can select a solver and define an optimization problem. Genetic algorithms and pattern search solvers can also be customized. For genetic algorithms, initial population and fitness scaling options can be modified and parent selection, crossover, and mutation functions can be defined by the users. For pattern search, polling, searching, and other functions can be defined.
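As a hedged sketch of such customization, applied to the example of Sec. 8.5: the option values below are purely illustrative, and the gaoptimset interface shown is the 2015-era one (newer releases favor optimoptions).

% Customize the population size, generation limit, and a built-in plot function
options = gaoptimset('PopulationSize', 50, ...
'Generations', 100, ...
'PlotFcns', @gaplotbestf);

% Re-run the Sec. 8.5 example with the customized options
[x, fval] = ga(@GA_Func, 1, [], [], [], [], 0, 10, @GA_cons, options);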

The MATLAB Multiobjective Genetic Algorithm Solver can be used to solve multiobjective optimization problems and generate Pareto frontiers. This solver can be applied to either smooth or nonsmooth optimization problems, with or without bound constraints and linear constraints.

As illustrated in Sec. 8.5, the MATLAB Global Optimization Toolbox can solve global optimization problems using the ga solver. It also provides other solvers for global optimization problems. The following steps are required to solve a global optimization problem using the toolbox.

1. Select a solver and define an optimization problem.

2. Set up and inspect the optimization options.

3. Run the optimization tool and visualize the intermediate and final results.

The toolbox includes a number of plotting functions for visualizing the optimization process and the results. These visualizations provide users with real-time feedback about the optimization progress. The toolbox also provides custom plotting functions for both genetic algorithms and pattern search algorithms.

While the optimization is running, one can change certain options to refine the solution and update the performance results in the genetic algorithm, multiobjective genetic algorithm, simulated annealing, and pattern search solvers. For example, one can enable or disable the plot functions, the output functions, and the command-line iterative display during run time to view intermediate results and query solution progress, without the need to stop and restart the solver. The user can also modify the termination criteria to refine the solution progression, or reduce the number of iterations required to achieve a desired tolerance, based on run-time performance feedback. An introductory webinar on the MATLAB Global Optimization Toolbox is provided in Ref. [6].

If the Parallel Computing Toolbox is available, it can be used in conjunction with the Global Optimization Toolbox to reduce computation time through parallel processing. Built-in support for parallel computing accelerates the objective and constraint function evaluation in genetic algorithm, multiobjective genetic algorithm, and pattern search solvers.
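For example, this built-in support can be switched on through a single option; note that, in releases older than about R2013, the option value was the string 'always' rather than a logical.

% Evaluate the fitness function in parallel across the pool workers
options = gaoptimset('UseParallel', true);
[x, fval] = ga(@GA_Func, 1, [], [], [], [], 0, 10, @GA_cons, options);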

This chapter provided an introductory presentation of global optimization, with sufficient information to tackle elementary problems. This book also presents more advanced approaches that can be used in more practical contexts. The pertinent chapters are: Discrete Optimization Basics (Chapter 9), Discrete Optimization (Chapter 14), and Evolutionary Algorithms (Chapter 19).

8.7 Summary

Most engineering design problems are nonlinear in nature, and the resulting nonlinear problems often involve multiple local optima. Global optimization is the process of identifying the best of these local optima, otherwise known as the global optimum. This chapter introduced the major global optimization approaches, ranging from basic approaches such as exhaustive search to advanced approaches such as genetic algorithms. The chapter concluded with an overview of the MATLAB Global Optimization Toolbox.

8.8 Problems

Warm-up Problems

8.1 Use the exhaustive search technique to find the global optimum for the following optimization problem.

(8.12)

subject to

(8.13)

(8.14)

8.2 Use the multiple start method to find the global optimum for the following problem.

(8.15)

subject to

(8.16)

8.3 Use GA to find the global optimum for the following problem.

(8.17)

subject to

(8.18)

Intermediate Problems

8.4 Consider the following optimization problem. Solve it using the GA Solver in MATLAB, using both the M-files and the graphical user interface.

(8.19)

subject to

(8.20)

(8.21)

(8.22)

(8.23)

8.5 Learn the peaks command from the MATLAB tutorial available on www.mathworks.com. In this optimization problem, the lower bounds of the two variables are -3 and their upper bounds are 3. Now use GA to determine the global maximum of the peaks function within the defined region. Generate a 3D plot of the peaks function using mesh, and label the global maximum on the plot. Turn in your M-file and the plots.

BIBLIOGRAPHY OF CHAPTER 8

[1] M. Locatelli and F. Schoen. Global Optimization: Theory, Algorithms, and Applications. SIAM, 2013.

[2] C. A. Floudas and P. M. Pardalos. Recent Advances in Global Optimization. Princeton University Press, 2014.

[3] S. Sumathi, T. Hamsapriya, and P. Surekha. Evolutionary Intelligence: An Introduction to Theory and Applications with MATLAB. Springer, 2008.

[4] J. H. Holland. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. MIT Press, 1992.

[5] D. E. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley Publishing Company, Inc., 1989.

[6] The MathWorks, Inc. Global optimization with MATLAB products. http://www.mathworks.com/videos/global-optimization-with-matlab-products-81716.html, 2011.