On the Convergence of a Class of Derivative-free Minimization Algorithms
(1975) On the Convergence of a Class of Derivative-free Minimization Algorithms. Technical Report CS75005-R, Computer Science, Virginia Tech.
A convergence analysis is presented for a general class of derivative-free algorithms for minimizing a function f(x) whose gradient and Hessian are impractical to obtain in analytic form. The class of algorithms accepts finite-difference approximations to the gradient, with step-sizes chosen according to a rule that imposes two conditions relating the step-sizes at successive iterate points. The algorithms also maintain an approximation to the second-derivative matrix and require that the change in x made by each iteration be subject to a bound that is likewise revised automatically. The convergence theorems have the feature that the starting point x1 need not be close to the true solution and f(x) need not be convex. Furthermore, despite the fact that the second-derivative approximation may not converge to the true Hessian at the solution, the rate of convergence is still Q-superlinear. The theory is also shown to apply to a modification of Powell's dog-leg algorithm.
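To make the structure of this class of algorithms concrete, the following is a minimal illustrative sketch, not the report's actual method: it combines forward-difference gradient approximations (with a step-size that shrinks along with the iterate changes), a maintained second-derivative approximation B (updated here by a BFGS-style formula as one concrete choice), and an automatically revised bound delta on the step, in the spirit of a dog-leg/trust-region iteration. All parameter names, thresholds, and the specific update rules are assumptions for illustration.

```python
import numpy as np

def fd_gradient(f, x, h):
    """Forward-difference approximation to the gradient of f at x,
    using a single scalar step-size h for every coordinate."""
    n = x.size
    fx = f(x)
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def dfo_trust_region(f, x, h=1e-4, delta=1.0, tol=1e-6, max_iter=200):
    """Illustrative derivative-free iteration: finite-difference
    gradients, an approximate Hessian B, and a step bound delta that
    is revised automatically.  Not the algorithm of the report."""
    n = x.size
    B = np.eye(n)                         # second-derivative approximation
    g = fd_gradient(f, x, h)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step from the quadratic model, clipped to the bound delta
        s = np.linalg.solve(B, -g)
        if np.linalg.norm(s) > delta:
            s *= delta / np.linalg.norm(s)
        pred = -(g @ s + 0.5 * s @ B @ s)  # predicted reduction
        actual = f(x) - f(x + s)
        rho = actual / pred if pred > 0 else -1.0
        if rho > 0.1:                      # accept the step
            x = x + s
            # shrink the difference step-size with the iterate change
            h = max(1e-7, min(h, np.linalg.norm(s)))
            g_new = fd_gradient(f, x, h)
            y = g_new - g
            if y @ s > 1e-12:              # BFGS update keeps B positive definite
                Bs = B @ s
                B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
            g = g_new
        # revise the step bound automatically
        delta = 2.0 * delta if rho > 0.75 else (0.5 * delta if rho < 0.25 else delta)
    return x

# Usage on a simple strictly convex quadratic with minimizer (1, 2)
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2
xmin = dfo_trust_region(f, np.zeros(2))
```

Note that, as in the abstract, convexity of f is not required by the theory; the quadratic above is chosen only to make the example's behavior easy to check.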
Item Type: Departmental Technical Report
Keywords: derivative-free, minimization, Q-superlinear convergence
Subjects: Computer Science > Historical Collection (Till Dec 2001)
Deposited By: Administrator, Eprints
Deposited On: 20 April 2006