A Newton-CG Algorithm with Complexity Guarantees for Smooth Unconstrained Optimization

Clément W. Royer, Michael O'Neill, Stephen J. Wright
2018

Abstract

We consider minimization of a smooth nonconvex objective function using an iterative algorithm based on Newton's method and the linear conjugate gradient algorithm, with explicit detection and use of negative curvature directions for the Hessian of the objective function. The algorithm closely tracks Newton-conjugate gradient procedures developed in the 1980s, but includes enhancements that allow worst-case complexity results to be proved for convergence to points that satisfy approximate first-order and second-order optimality conditions. The complexity results match the best known results in the literature for second-order methods.
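The centerpiece of the method is a "capped" conjugate gradient subroutine (Algorithm 1 in the paper) that either approximately solves the Newton system or returns a direction of sufficiently negative curvature for the Hessian. The Python sketch below illustrates the core idea, a standard CG loop on H y = -g augmented with a curvature test. It is a simplified illustration under stated assumptions, not the paper's full Algorithm 1: it omits the paper's damped matrix, iteration cap, and the additional curvature checks on the CG iterates, and the names (capped_cg, eps) are illustrative rather than taken from the paper or from scikit-learn.

import numpy as np

def capped_cg(H, g, eps, max_iter=100):
    # Sketch of a negative-curvature-aware CG loop: approximately solve
    # H y = -g, but abort and return the current search direction p as
    # soon as the curvature p^T H p along p drops below eps * ||p||^2.
    y = np.zeros_like(g)
    r = g.copy()              # residual of H y + g (y = 0 initially)
    p = -r                    # first CG search direction
    for _ in range(max_iter):
        Hp = H @ p
        if p @ Hp < eps * (p @ p):
            return p, "negative_curvature"   # curvature test triggered
        alpha = (r @ r) / (p @ Hp)
        y = y + alpha * p                    # CG iterate update
        r_new = r + alpha * Hp               # residual update
        if np.linalg.norm(r_new) <= eps * np.linalg.norm(g):
            return y, "solution"             # inexact Newton step found
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p                # next conjugate direction
        r = r_new
    return y, "max_iter"

# Example on an indefinite Hessian: CG encounters negative curvature
# and returns that direction instead of a Newton step.
H = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([1.0, 1.0])
d, status = capped_cg(H, g, eps=1e-6)
print(status, d, d @ H @ d)   # negative_curvature, with d^T H d < 0

The returned status lets an outer Newton-type loop decide what to do with the direction: take a (scaled) Newton step when CG converges, or move along the negative curvature direction to escape saddle points, which is what makes the second-order complexity guarantees possible.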


Code References

scikit-learn/scikit-learn
1 file
sklearn/utils/optimize.py
# See https://arxiv.org/abs/1803.02924, Algo 1 Capped Conjugate Gradient.