Stochastics Perturbations of Global Optimization (e-book, used book) | bookbook.eu

Description

In this book, the global optimization of a nonconvex objective function is studied via stochastic perturbation. Stochastic perturbation is a method for transforming local minimization procedures into global ones in the framework of continuous optimization. We have considered a general problem of continuous optimization, unconstrained or with linear constraints, where the objective function may be nonsmooth. Standard methods for smooth functions usually generate a descent direction by using the gradient and may be extended to nonsmooth situations by using a generalized gradient in place of the standard one where necessary; for instance, Clarke's generalized gradient may be used at points where the objective function is not differentiable. Following this observation, we have considered a variable metric descent method and introduced suitable affine local approximations. The projected variable metric descent method is considered for continuous optimization with linear constraints, and we have considered the generalized reduced gradient (GRG) method for optimization with nonlinear constraints where the objective function is twice differentiable.
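The descent-plus-perturbation idea described above is easy to prototype. Below is a minimal sketch in Python, assuming a plain gradient step perturbed by Gaussian noise with decaying variance; it illustrates the general technique only, not the variable metric method or the GRG scheme developed in the book. The test objective (Rastrigin), the step size, and the decay schedule are all assumptions chosen for the demonstration.

```python
import numpy as np

def perturbed_descent(f, grad, x0, n_iter=5000, step=1e-2, sigma0=1.0, seed=0):
    """Gradient descent with a decaying Gaussian perturbation.

    Each iterate takes a plain descent step and is then perturbed by
    Gaussian noise whose standard deviation shrinks with the iteration
    count, so the search can hop between basins early on and behaves
    like a local descent method late in the run. The best point seen
    is tracked and returned.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, n_iter + 1):
        sigma = sigma0 / np.sqrt(k)  # assumed decay schedule
        x = x - step * grad(x) + sigma * rng.standard_normal(x.shape)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Rastrigin: a standard multimodal test function whose many local minima
# trap plain gradient descent started away from the origin.
def rastrigin(x):
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def rastrigin_grad(x):
    return 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)

x_best, f_best = perturbed_descent(rastrigin, rastrigin_grad, x0=[3.2, -2.7])
print(x_best, f_best)  # typically close to the global minimum at the origin
```

Replacing `grad` with a Clarke generalized gradient at nondifferentiable points, or projecting each iterate onto a linear feasible set, would move this sketch toward the nonsmooth and projected variants the book develops.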

EXTRA 10% discount with code: EXTRA

118,70 € (regular price 131,89 €)

Ships in 10–14 business days.

The discount code is valid for purchases of 10 € or more. Discounts do not stack.

Reviews

  • No reviews yet. 0 customers have rated this item.