Online First
Mini-Batch Stochastic Conjugate Gradient Algorithms with Minimal Variance
Caixia Kou, Feifei Gao and Yu-Hong Dai

J. Comp. Math. DOI: 10.4208/jcm.2505-m2025-0004

Publication Date: 2025-06-24

  • Abstract

Stochastic gradient descent (SGD) methods have gained widespread popularity for solving large-scale optimization problems. However, the variance inherent in stochastic gradient estimates often slows convergence. We introduce a family of unbiased stochastic gradient estimators that encompasses existing estimators from the literature, and within this family we identify an estimator that remains unbiased while achieving minimal variance. Compared with the standard estimator used in SGD algorithms, the proposed estimator reduces variance significantly. Using this estimator to approximate the full gradient, we propose two mini-batch stochastic conjugate gradient algorithms with minimal variance. Under strong convexity and smoothness assumptions on the objective function, we prove that both algorithms achieve linear convergence rates. Numerical experiments confirm the estimator's effectiveness in reducing variance and show that the two stochastic conjugate gradient algorithms converge faster and more stably.
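
The abstract does not spell out the estimator family, so the following is only a minimal Python sketch under an assumed control-variate form: g_alpha(x) = grad_B(x) - alpha * (grad_B(x_tilde) - full_grad(x_tilde)), where x_tilde is a snapshot point whose full gradient is known. This family recovers plain mini-batch SGD at alpha = 0 and an SVRG-style estimator at alpha = 1, and the classical variance-minimizing alpha is the control-variate coefficient. The Fletcher-Reeves rule below stands in for whichever conjugate direction update the paper actually uses; all names (optimal_alpha, stochastic_cg, grad_i) are illustrative, not taken from the paper.

```python
import numpy as np

def optimal_alpha(gx, gt):
    """Classical variance-minimizing control-variate coefficient,
    alpha* = tr Cov(g_B(x), g_B(x_tilde)) / tr Var(g_B(x_tilde)),
    estimated from the per-sample gradients of the current batch.
    (Reusing the same batch is a simplification; a held-out estimate
    would preserve exact unbiasedness.)"""
    cx = gx - gx.mean(axis=0)
    ct = gt - gt.mean(axis=0)
    return float(np.sum(cx * ct) / max(np.sum(ct * ct), 1e-12))

def stochastic_cg(grad_i, full_grad, x0, n, batch_size=32, eta=0.05,
                  epochs=20, seed=0):
    """Mini-batch stochastic CG driven by the assumed variance-reduced
    estimator; the snapshot x_tilde and its full gradient are refreshed
    once per epoch. Fixed step size eta for simplicity."""
    rng = np.random.default_rng(seed)
    x, d, g_prev2 = x0.copy(), None, None
    for _ in range(epochs):
        x_tilde = x.copy()
        mu_tilde = full_grad(x_tilde)          # one full gradient per epoch
        for _ in range(n // batch_size):
            batch = rng.choice(n, size=batch_size, replace=False)
            gx = np.array([grad_i(i, x) for i in batch])        # grads at x
            gt = np.array([grad_i(i, x_tilde) for i in batch])  # grads at snapshot
            alpha = optimal_alpha(gx, gt)
            # Unbiased for any alpha, since E[grad_B(x_tilde)] = mu_tilde:
            g = gx.mean(axis=0) - alpha * (gt.mean(axis=0) - mu_tilde)
            # Fletcher-Reeves conjugate direction (one of several choices):
            beta = 0.0 if g_prev2 is None else (g @ g) / g_prev2
            d = -g if d is None else -g + beta * d
            g_prev2 = g @ g
            x = x + eta * d
    return x
```

For instance, for least squares f(x) = (1/2n) sum_i (a_i^T x - b_i)^2, one could pass grad_i as `lambda i, x: (A[i] @ x - b[i]) * A[i]` and full_grad as `lambda x: A.T @ (A @ x - b) / n`.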

  • Copyright

© Global Science Press