Inference of Online Newton Methods with Nesterov's Accelerated Sketching
Researchers accelerate Newton's method for streaming data, cutting the per-iteration cost from cubic to quadratic in the problem dimension.
Researchers from the statistics and machine learning community have developed a new online Newton method that dramatically reduces the computational cost of second-order optimization for streaming data. The method, introduced in a 51-page paper on arXiv, combines Hessian averaging with a sketch-and-project solver enhanced by Nesterov's acceleration. This allows the algorithm to compute approximate Newton directions in O(d²) time and memory complexity, matching the efficiency of first-order methods while retaining the robustness of second-order approaches against ill-conditioning and noise heterogeneity.
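The core computational idea can be illustrated with a minimal sketch-and-project solver. The version below uses plain coordinate sketches and no acceleration; the paper's solver additionally applies Nesterov momentum and may use a different sketch distribution, and the function name, arguments, and test problem here are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def sketch_and_project(H, g, sketch_size=5, iters=2000, seed=0):
    """Approximately solve H x = g for symmetric positive-definite H
    without an O(d^3) factorization: each iteration draws a random
    coordinate sketch S and projects the iterate onto the solution set
    of the sketched system S^T H x = S^T g, which only requires
    solving a small sketch_size x sketch_size linear system."""
    rng = np.random.default_rng(seed)
    d = H.shape[0]
    x = np.zeros(d)
    for _ in range(iters):
        idx = rng.choice(d, size=sketch_size, replace=False)  # coordinate sketch S
        residual = H[idx] @ x - g[idx]       # S^T (H x - g), costs O(sketch_size * d)
        block = H[np.ix_(idx, idx)]          # S^T H S
        # x <- x - S (S^T H S)^{-1} S^T (H x - g): only the sketched coords move
        x[idx] -= np.linalg.solve(block, residual)
    return x
```

Each update touches only O(d · sketch_size) entries, and for positive-definite H the iterates converge linearly to H⁻¹g; Nesterov-style acceleration, as used in the paper, improves that linear rate.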
The authors provide rigorous theoretical guarantees, including global almost-sure convergence, asymptotic normality of the last iterate, and a fully online covariance estimator with non-asymptotic convergence guarantees. They also connect their uncertainty quantification to exact and sketched Newton methods. Extensive experiments on regression models demonstrate that the method achieves superior online inference compared to existing approaches, making it a promising tool for reliable decision-making with streaming data.
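To make the uncertainty-quantification side concrete, here is a generic online plug-in "sandwich" covariance estimator that runs in O(d²) memory over a stream of per-sample gradients and Hessians. This is a standard construction sketched for illustration, not the paper's estimator (which is Lyapunov-equation based and comes with non-asymptotic guarantees); the function name and interface are assumptions.

```python
import numpy as np

def online_sandwich_cov(stream):
    """Maintain running averages of per-sample Hessians (H_bar) and
    gradient outer products (S_bar) in O(d^2) memory, then return the
    plug-in sandwich covariance H_bar^{-1} S_bar H_bar^{-1}."""
    H_bar, S_bar, n = None, None, 0
    for grad, hess in stream:
        n += 1
        if H_bar is None:
            H_bar = np.zeros_like(hess)
            S_bar = np.zeros_like(hess)
        H_bar += (hess - H_bar) / n                  # running mean of Hessians
        S_bar += (np.outer(grad, grad) - S_bar) / n  # running mean of g g^T
    H_inv = np.linalg.inv(H_bar)
    return H_inv @ S_bar @ H_inv
```

For a well-specified linear model evaluated at the true parameter, this recovers the classical asymptotic covariance σ²·E[xxᵀ]⁻¹ as the stream grows.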
- Achieves O(d²) time and memory complexity, matching first-order methods while using second-order updates.
- Proves global almost-sure convergence and asymptotic normality of the last iterate, with a limiting covariance characterized by a Lyapunov equation.
- Develops a fully online covariance estimator with non-asymptotic convergence guarantees for uncertainty quantification.
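Putting the pieces together, a minimal unsketched version of Hessian-averaged online Newton for streaming least squares might look like the following. The damping constant, the 1/t step schedule, and the function name are illustrative assumptions; the paper replaces the dense linear solve with the accelerated sketch-and-project solver to reach O(d²) per step.

```python
import numpy as np

def hessian_averaged_newton(stream, d, damping=1e-2):
    """Online Newton with Hessian averaging for streaming squared loss:
    average the per-sample Hessians x x^T as data arrive and take
    1/t-damped Newton steps along the averaged curvature."""
    theta = np.zeros(d)
    H_bar = np.zeros((d, d))
    t = 0
    for x, y in stream:
        t += 1
        H_bar += (np.outer(x, x) - H_bar) / t  # running Hessian average
        grad = x * (x @ theta - y)             # per-sample gradient of squared loss
        # Small ridge keeps early (rank-deficient) averages invertible;
        # the paper would compute this direction with sketch-and-project.
        direction = np.linalg.solve(H_bar + damping * np.eye(d), grad)
        theta -= direction / t                 # Robbins-Monro 1/t step size
    return theta
```

Averaging the Hessians smooths out per-sample curvature noise, which is what lets the iterates satisfy a central limit theorem of the kind the paper proves.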
Why It Matters
Enables robust, uncertainty-aware optimization for streaming data without the cubic cost of traditional Newton methods.