Tight analyses for subgradient descent I: Lower bounds
Consider the problem of minimizing functions that are Lipschitz and convex, but not necessarily differentiable. We construct a function from this class for which the $T$-th iterate of subgradient descent has error $\Omega(\log(T)/\sqrt{T})$. This matches a known upper bound of $O(\log(T)/\sqrt{T})$...
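For orientation, below is a minimal sketch of projected subgradient descent on a 1-Lipschitz convex function, the class of problems the article studies. The helper `subgradient_descent`, the example $f(x) = |x|$, and the step-size schedule $\eta_t = 1/\sqrt{t}$ are illustrative assumptions; the paper's lower-bound construction is a different, carefully built function.

```python
import numpy as np

def subgradient_descent(subgrad, x0, T, R=1.0):
    """Projected subgradient descent on a convex, Lipschitz function.

    subgrad: returns any subgradient of f at x
    x0: starting point
    T: number of iterations
    R: radius of the feasible interval [-R, R]
    """
    x = float(x0)
    for t in range(1, T + 1):
        eta = 1.0 / np.sqrt(t)      # standard step size for Lipschitz convex f (an assumption here)
        x = x - eta * subgrad(x)
        x = max(-R, min(R, x))      # project back onto [-R, R]
    return x                        # final iterate; the article analyzes its error

# Example: f(x) = |x| is 1-Lipschitz and convex but not differentiable at 0.
f = abs
sg = lambda x: np.sign(x) if x != 0 else 0.0
x_T = subgradient_descent(sg, x0=1.0, T=10_000)
print(f"f(x_T) = {f(x_T):.4f}")
```

The article's result says that for the last iterate, the extra $\log(T)$ factor over the averaged-iterate rate $O(1/\sqrt{T})$ is unavoidable on some function in this class.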
| Main Authors: | Harvey, Nicholas J. A.; Liaw, Chris; Randhawa, Sikander |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | Université de Montpellier, 2024-07-01 |
| Series: | Open Journal of Mathematical Optimization |
| Online Access: | https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.31/ |
Similar Items
- Function approximation method based on weights gradient descent in reinforcement learning
  by: Xiaoyan QIN, et al.
  Published: (2023-08-01)
- Variable-Parameter Impedance Control of Manipulator Based on RBFNN and Gradient Descent
  by: Linshen Li, et al.
  Published: (2024-12-01)
- Smoothing gradient descent algorithm for the composite sparse optimization
  by: Wei Yang, et al.
  Published: (2024-11-01)
- Forest fire risk assessment model optimized by stochastic average gradient descent
  by: Zexin Fu, et al.
  Published: (2025-01-01)
- Gradient descent Sarsa(λ) algorithm based on the adaptive potential function shaping reward mechanism
  by: Fei XIAO, et al.
  Published: (2013-01-01)