Optimization techniques

If we classify numerical optimization techniques by how the design point is improved after each iteration, there are three kinds: non-gradient-based, gradient-based, and hybrid optimization techniques. They are described briefly as follows:

Non-gradient-based optimization techniques do not require the objective function, f(x), to be differentiable, because the algorithms do not use derivatives of f(x). Examples of non-gradient-based optimization techniques are adaptive simulated annealing, Hooke-Jeeves direct search, and the genetic algorithm (GA). These techniques tend to reach a global optimum but require a huge number of function evaluations. GA is a well-known non-gradient-based optimization technique. It is a stochastic search and optimization algorithm that mimics Darwin's theory of biological evolution.
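The GA described above can be illustrated with a minimal sketch. The paper does not give an implementation, so the function names, population size, and mutation scheme below are illustrative assumptions, not the authors' method:

```python
import random

def genetic_algorithm(f, bounds, pop_size=40, generations=100,
                      mutation_rate=0.1, seed=0):
    """Minimal real-coded GA minimizing f over box bounds.

    Illustrative sketch only: rank-based selection, blend crossover,
    and Gaussian mutation; the elite half survives unchanged.
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=f)          # rank population by fitness
        elite = scored[: pop_size // 2]      # keep the better half (elitism)
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]  # blend crossover
            for i, (lo, hi) in enumerate(bounds):                # Gaussian mutation
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0.0, 0.1 * (hi - lo))
                    child[i] = min(max(child[i], lo), hi)        # clip to bounds
            children.append(child)
        pop = elite + children
    return min(pop, key=f)

# Example: minimize the sphere function, whose optimum is at the origin.
best = genetic_algorithm(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```

Note that no derivative of f appears anywhere: only function evaluations are used, which is why such methods tolerate non-differentiable objectives but need many evaluations.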

Gradient-based techniques define the search directions using the gradient of the function at the current point. In practice, there are many kinds of gradient-based optimization techniques, such as generalized reduced gradient, conjugate gradient, the method of feasible directions, mixed-integer optimization, sequential linear programming, sequential quadratic programming, and Davidon-Fletcher-Powell. Gradient-based techniques generally converge quickly, but they may require a long run time when the number of variables increases. Gradient-based techniques also risk converging to a local extremum for highly nonlinear optimization problems.
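As a concrete illustration of the gradient-based idea (with derivatives approximated by finite differences, as in Fig. 1), the following sketch applies plain steepest descent. It is a minimal assumed example, not one of the specific methods listed above:

```python
def finite_diff_grad(f, x, h=1e-6):
    """Forward finite-difference approximation of the gradient of f at x."""
    fx = f(x)
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        grad.append((f(xp) - fx) / h)
    return grad

def gradient_descent(f, x0, step=0.1, iters=200):
    """Plain steepest descent; it converges to a local extremum only."""
    x = list(x0)
    for _ in range(iters):
        g = finite_diff_grad(f, x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Quadratic example: the minimum is at (1, -2).
xmin = gradient_descent(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, [5.0, 5.0])
```

On this convex quadratic the iterates contract toward the minimum; on a highly nonlinear objective the same loop would stop at whichever local extremum the starting point leads to, which is exactly the risk noted in the text.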

Hybrid optimization techniques apply non-gradient-based and gradient-based techniques in sequence in order to exploit the advantages and reduce the disadvantages of each single technique. Presenting all of these optimization techniques is beyond the scope of this paper.

2.2. The common optimization methods

The terminology "optimization method" used in this paper refers to whether or not explicit objective functions are formulated. For simulation-based optimization, the objective functions are often implicit: the value of the objective function is unknown until simulation results are obtained. Two approaches are used to solve the optimization problem, direct optimization and metamodel-based optimization methods, as shown in Fig. 1. The details of these two optimization methods are described as follows.

2.2.1. Direct optimization methods

Direct numerical optimization is an approach in which explicit objective functions are not required. Both gradient-based and non-gradient-based optimization techniques can be applied to solve the optimization problem. Sometimes, direct optimization methods combine the GA with other optimization techniques. It is well known that GA tends to reach a global extremum, but this method requires a large number of function evaluations. On the contrary, gradient-based methods efficiently guarantee a local extremum. If these two algorithms are combined as a hybrid system, they can combine their advantages and remove their disadvantages.

Fig. 1. Classification of optimization methods. (The figure shows the direct discrete optimization method, with no explicit mathematical functions relating inputs and outputs, branching into gradient-based optimization techniques, with derivatives by finite difference, and non-gradient-based optimization techniques.)
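The two-stage hybrid idea can be sketched as follows. This is an assumed minimal illustration, not the paper's method: a global random search stands in for the GA stage, and finite-difference steepest descent provides the local refinement.

```python
import math
import random

def hybrid_minimize(f, bounds, n_samples=200, step=0.002, iters=300, seed=0):
    """Two-stage hybrid sketch: global sampling (standing in for the GA
    stage) picks a promising start; local finite-difference gradient
    descent then refines it toward the nearest extremum."""
    rng = random.Random(seed)
    # Stage 1: global exploration over the box bounds.
    start = min(
        ([rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_samples)),
        key=f,
    )
    # Stage 2: local refinement with forward finite-difference gradients.
    h, x = 1e-6, list(start)
    for _ in range(iters):
        fx = f(x)
        g = [(f(x[:i] + [x[i] + h] + x[i + 1:]) - fx) / h
             for i in range(len(x))]
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return start, x

# Multimodal example (Rastrigin function); the global minimum is at the origin.
rastrigin = lambda x: sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)
start, refined = hybrid_minimize(rastrigin, [(-5.12, 5.12)] * 2)
```

The global stage avoids committing to the basin of a poor local minimum, while the gradient stage does the final convergence cheaply, which is the complementarity the text describes.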
