Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

Shai Shalev-Shwartz, Tong Zhang
2013

Abstract

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
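The paper builds on stochastic dual coordinate ascent (SDCA), which repeatedly picks a random example and maximizes the dual objective over that example's single dual variable in closed form. As an illustration only, here is a minimal sketch of plain (non-proximal, non-accelerated) SDCA for ridge regression, using the standard closed-form coordinate update for the squared loss; the function name and all parameters are ours, not from the paper's code.

```python
import numpy as np

def sdca_ridge(X, y, lam=0.1, epochs=50, seed=0):
    """Plain SDCA sketch for ridge regression:
        min_w (1/n) * sum_i (x_i . w - y_i)^2 + (lam/2) * ||w||^2.
    Maintains dual variables alpha with w = X.T @ alpha / (lam * n),
    and maximizes the dual over one coordinate per step in closed form."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = np.einsum("ij,ij->i", X, X)  # precompute ||x_i||^2
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Closed-form maximizer of the dual over coordinate alpha_i
            # for the squared loss phi(a) = (a - y_i)^2.
            delta = (y[i] - X[i] @ w - 0.5 * alpha[i]) / (
                0.5 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += delta * X[i] / (lam * n)  # keep w = X.T @ alpha / (lam * n)
    return w

# Tiny synthetic check: recover a planted weight vector.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.standard_normal(200)
lam = 0.1
w = sdca_ridge(X, y, lam=lam)
```

After enough epochs the iterate agrees with the closed-form ridge solution of the same objective; the paper's contribution is to wrap such inner coordinate steps in an outer accelerated proximal loop, which this sketch does not implement.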


Code References

tensorflow/tensorflow
  tensorflow/core/kernels/squared-loss.h
L26 // See page 23 of http://arxiv.org/pdf/1309.2375v2.pdf for the derivation of