Effective Dimension Aware Fractional-Order Stochastic Gradient Descent for Convex Optimization Problems

Authors

  • Mohammad Partohaghighi
  • Roummel Marcia
  • YangQuan Chen

DOI:

https://doi.org/10.29229/uzmj.2025-3-15

Keywords:

Fractional Calculus, Stochastic Gradient Descent, Two-Scale Effective Dimension, More Optimal Optimization

Abstract

Fractional-order stochastic gradient descent (FOSGD) leverages fractional exponents to capture long-memory effects in optimization. However, its utility is often limited by the difficulty of tuning and stabilizing these exponents. We propose 2SED Fractional-Order Stochastic Gradient Descent (2SEDFOSGD), which integrates the Two-Scale Effective Dimension (2SED) algorithm with FOSGD to adapt the fractional exponent in a data-driven manner. By tracking model sensitivity and effective dimensionality, 2SEDFOSGD dynamically modulates the exponent to mitigate oscillations and hasten convergence. Theoretically, this approach preserves the advantages of fractional memory without the sluggish or unstable behavior observed in naïve fractional SGD. Empirical evaluations in Gaussian and $\alpha$-stable noise scenarios using an autoregressive (AR) model highlight faster convergence and more robust parameter estimates compared to baseline methods, underscoring the potential of dimension-aware fractional techniques for advanced modeling and estimation tasks.
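To make the idea concrete, the sketch below shows one common Caputo-style discretization of a fractional-order SGD step. The paper's exact 2SED update rule is not reproduced on this page, so the fixed exponent `alpha` here is a placeholder for the data-driven, dimension-aware exponent the authors adapt online; the formulation, parameter names, and `eps` safeguard are illustrative assumptions, not the published algorithm.

```python
import numpy as np
from math import gamma

def fosgd_step(theta, grad, prev_theta, lr=0.1, alpha=0.9, eps=1e-8):
    """One fractional-order SGD step (a common Caputo-style discretization).

    The gradient is rescaled by |theta - prev_theta|^(1 - alpha) / Gamma(2 - alpha),
    so alpha = 1 recovers ordinary SGD. In 2SEDFOSGD, alpha would instead be
    adapted from a two-scale effective-dimension estimate (not shown here).
    """
    # Fractional "memory" term; eps avoids a zero step when theta == prev_theta.
    memory = (np.abs(theta - prev_theta) + eps) ** (1.0 - alpha)
    return theta - lr * grad * memory / gamma(2.0 - alpha)

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, prev = 1.0, 1.01
for _ in range(200):
    new = fosgd_step(theta, 2.0 * theta, prev, lr=0.1, alpha=0.9)
    prev, theta = theta, new
```

With `alpha = 1.0` the memory factor and `Gamma(2 - alpha)` both equal one, confirming the plain SGD limit; values of `alpha` below one shrink the step as successive iterates cluster, which is the long-memory damping effect the abstract refers to.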

Published

2025-09-05

How to Cite

Effective Dimension Aware Fractional-Order Stochastic Gradient Descent for Convex Optimization Problems. (2025). Uzbek Mathematical Journal, 69(3), 142-152. https://doi.org/10.29229/uzmj.2025-3-15