
New Versions of Gradient Temporal Difference Learning

Authors:
Lee, Donghwan
Lim, Han-Dong
Park, Jihoon
Choi, Okyong
Publication Year:
2021

Abstract

Sutton, Szepesvári and Maei introduced the first gradient temporal-difference (GTD) learning algorithms compatible with both linear function approximation and off-policy training. The goal of this paper is (a) to propose several variants of GTDs together with an extensive comparative analysis and (b) to establish new theoretical analysis frameworks for the GTDs. The variants are based on convex-concave saddle-point interpretations of GTDs, which unify all the GTDs into a single framework and yield a simple stability analysis based on recent results on primal-dual gradient dynamics. Finally, a numerical comparative analysis is given to evaluate these approaches.
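
To make the abstract concrete, below is a minimal sketch of the classical GTD2 update with linear function approximation and off-policy importance weighting, i.e., the original scheme of Sutton, Szepesvári and Maei rather than the variants proposed in this paper. The function name, step sizes, and discount factor are assumptions chosen for illustration.

```python
import numpy as np

def gtd2_update(theta, w, phi, phi_next, reward, rho,
                gamma=0.99, alpha=0.01, beta=0.05):
    """One GTD2 step with linear function approximation.

    theta    : value-function parameters (primal variable)
    w        : auxiliary parameters (dual variable)
    phi      : feature vector of the current state
    phi_next : feature vector of the next state
    rho      : importance-sampling ratio pi(a|s) / b(a|s)
    """
    # TD error under the current linear value estimate
    delta = reward + gamma * phi_next @ theta - phi @ theta
    # Primal step: stochastic estimate of the MSPBE gradient direction
    theta = theta + alpha * rho * (phi - gamma * phi_next) * (phi @ w)
    # Dual step: w tracks the feature-space projection of the TD error
    w = w + beta * rho * (delta - phi @ w) * phi
    return theta, w
```

In the saddle-point view referenced in the abstract, theta plays the role of the primal variable and w the dual variable, so the two updates can be read as primal-dual gradient steps on a convex-concave objective; the paper's variants and stability analysis build on this interpretation.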

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2109.04033
Document Type:
Working Paper