A Derivative-free Two Level Random Search Method for Unconstrained Optimization
SpringerBriefs in Optimization


    • US$49.99

Publisher Description

The book is intended for graduate students and researchers in mathematics, computer science, and operational research. It presents a new derivative-free optimization method/algorithm based on randomly generated trial points in specified domains, where the best points are selected at each iteration according to a number of rules. The method differs from many well-established methods presented in the literature and proves competitive for solving unconstrained optimization problems with different structures and complexities and a relatively large number of variables. Intensive numerical experiments on 140 unconstrained optimization problems, with up to 500 variables, have shown that this approach is efficient and robust.
The book is structured into four chapters. Chapter 1 is introductory. Chapter 2 presents the two-level derivative-free random search method for unconstrained optimization; it is assumed that the minimizing function is continuous, lower bounded, and that its minimum value is known. Chapter 3 proves the convergence of the algorithm. Chapter 4 reports the numerical performance of the algorithm on 140 unconstrained optimization problems, 16 of which are real applications; the results show that the optimization process has two phases: a reduction phase and a stalling phase. Finally, the performance of the algorithm on 30 large-scale unconstrained optimization problems with up to 500 variables is presented. These numerical results show that the two-level random search approach is able to solve a large diversity of problems with different structures and complexities.
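The two-level idea described above (trial points sampled over the whole domain, plus local trial points sampled around the current best point, with the best candidate kept at each iteration) can be sketched roughly as follows. This is an illustrative simplification, not the book's exact algorithm: the function name, sampling rules, and all parameters are assumptions.

```python
import random

def two_level_random_search(f, x0, bounds, n_trial=50, n_local=20,
                            local_radius=0.1, max_iter=200, seed=0):
    """Illustrative sketch of a two-level random search (not the book's
    exact algorithm). Level 1 draws trial points in the whole domain;
    level 2 draws local trial points around the current best point."""
    rng = random.Random(seed)
    best_x, best_f = list(x0), f(x0)
    for _ in range(max_iter):
        # Level 1: trial points drawn uniformly from the global bounds.
        candidates = [[rng.uniform(lo, hi) for lo, hi in bounds]
                      for _ in range(n_trial)]
        # Level 2: local trial points in a small box around the best point.
        candidates += [[xi + rng.uniform(-local_radius, local_radius)
                        for xi in best_x] for _ in range(n_local)]
        # Selection rule: keep the best point found so far.
        for x in candidates:
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Example: minimize the sphere function f(x) = sum(x_i^2) on [-5, 5]^2.
sphere = lambda x: sum(xi * xi for xi in x)
x, fx = two_level_random_search(sphere, [4.0, -3.0], [(-5, 5), (-5, 5)])
```

The local (level-2) sampling is what refines the solution once the global samples stop improving, which mirrors the reduction and stalling phases observed in the book's experiments.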

A number of open problems remain, concerning the following aspects: the selection of the number of trial points and the number of local trial points, the selection of the bounds of the domains in which the trial points and the local trial points are randomly generated, and a criterion for initiating the line search.

Genre: Science & Nature
Released: 31 March 2021
Language: English (EN)
Length: 129 pages
Publisher: Springer International Publishing
Seller: Springer Nature B.V.
Size: 18.3 MB
Modern Numerical Nonlinear Optimization (2022)
Numerical Analysis for Engineers and Scientists (2014)
Numerical Methods for Roots of Polynomials - Part I (2007)
Numerical Methods and Optimization (2022)
Splitting Algorithms, Modern Operator Theory, and Applications (2019)
Optimization and Applications (2021)
Modern Numerical Nonlinear Optimization (2022)
Nonlinear Conjugate Gradient Methods for Unconstrained Optimization (2020)
Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology (2017)
Nonlinear Optimization Applications Using the GAMS Technology (2013)
Intentional Risk Management through Complex Networks Analysis (2015)
BONUS Algorithm for Large Scale Stochastic Nonlinear Programming Problems (2015)
Topics in Matroid Theory (2013)
Data Storage for Social Networks (2012)
Demand Flexibility in Supply Chain Planning (2012)
Multiple Information Source Bayesian Optimization (2025)