Prelimit Coupling and Steady-State Convergence of Constant-stepsize Nonsmooth Contractive SA
Abstracts of the 2024 ACM SIGMETRICS/IFIP PERFORMANCE Joint International Conference on Measurement and Modeling of Computer Systems (2024)
Abstract
Motivated by Q-learning, we study nonsmooth contractive stochastic
approximation (SA) with constant stepsize. We focus on two important classes of
dynamics: 1) nonsmooth contractive SA with additive noise, and 2) synchronous
and asynchronous Q-learning, which features both additive and multiplicative
noise. For both dynamics, we establish weak convergence of the iterates to a
stationary limit distribution in Wasserstein distance. Furthermore, we propose
a prelimit coupling technique for establishing steady-state convergence and
characterize the limit of the stationary distribution as the stepsize goes to
zero. Using this result, we show that the asymptotic bias of nonsmooth SA is
proportional to the square root of the stepsize, in sharp contrast to smooth
SA, whose bias scales linearly with the stepsize. This bias characterization
enables Richardson-Romberg extrapolation for bias reduction in nonsmooth SA.
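The square-root bias scaling dictates the extrapolation weights: halving the bias requires quartering the stepsize, so the combination 2·x̄(α/4) − x̄(α) cancels the leading √α term. A minimal sketch on a toy one-dimensional nonsmooth contraction; the drift h(x) (a kink at the fixed point x* = 0), the constants, and all names here are illustrative assumptions, not the paper's construction:

```python
import random

def sa_mean(alpha, n_steps, seed):
    """Long-run average of a 1-D constant-stepsize SA iterate.

    Toy nonsmooth contractive drift (illustrative assumption, not the
    paper's setting): h(x) = -x for x >= 0, h(x) = -2x for x < 0, with
    additive standard Gaussian noise. The kink at x* = 0 produces a
    steady-state bias on the order of sqrt(alpha).
    """
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    burn_in = n_steps // 10  # discard transient before averaging
    for k in range(n_steps + burn_in):
        drift = -x if x >= 0 else -2.0 * x
        x += alpha * (drift + rng.gauss(0.0, 1.0))
        if k >= burn_in:
            total += x
    return total / n_steps

alpha = 0.1
m_hi = sa_mean(alpha, 400_000, seed=0)      # bias ~ c * sqrt(alpha)
m_lo = sa_mean(alpha / 4, 400_000, seed=1)  # bias ~ c * sqrt(alpha) / 2
m_rr = 2.0 * m_lo - m_hi                    # leading sqrt-alpha term cancels
print(m_hi, m_lo, m_rr)
```

Note the contrast with smooth SA, where bias is linear in α and the standard weights 2·x̄(α/2) − x̄(α) apply; the √α rate changes the stepsize ratio needed for cancellation.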