
SVGD: a Virtual Gradients Descent Method for Stochastic Optimization

EasyChair Preprint no. 1494

12 pages
Date: September 12, 2019

Abstract

Inspired by dynamic programming, we propose the Stochastic Virtual Gradient Descent (SVGD) algorithm, in which the virtual gradient is defined via the computational graph and automatic differentiation. The method is computationally efficient and has low memory requirements. We also analyze the theoretical convergence properties and the implementation of the algorithm. Experimental results on multiple datasets and network models show that SVGD has advantages over other stochastic optimization methods.
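The abstract names only the ingredients (a computational graph plus automatic differentiation); the virtual-gradient construction itself is in the PDF, not on this page. As background, the following is a minimal sketch, assuming a hypothetical least-squares loss, of how reverse-mode automatic differentiation produces the gradient of a loss defined by a computational graph and drives a plain stochastic gradient descent update. It is not the authors' SVGD method; per the abstract, SVGD would replace the ordinary gradient below with a virtual gradient derived from the same graph.

import jax
import jax.numpy as jnp

# Hypothetical least-squares loss, used purely for illustration.
def loss(params, x, y):
    pred = x @ params
    return jnp.mean((pred - y) ** 2)

# jax.grad traces `loss` into a computational graph and applies
# reverse-mode automatic differentiation to it.
grad_fn = jax.grad(loss)

def sgd_step(params, x, y, lr=0.1):
    g = grad_fn(params, x, y)   # ordinary autodiff gradient
    return params - lr * g      # plain SGD update (not SVGD)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 4))
true_w = jnp.array([1.0, -2.0, 0.5, 3.0])
y = x @ true_w
params = jnp.zeros(4)
for _ in range(100):
    params = sgd_step(params, x, y)
print(params)  # approaches true_w

Here `g` is the standard gradient obtained by automatic differentiation; the preprint's contribution lies in how the update direction is constructed from the computational graph, which this sketch does not reproduce.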

Keyphrases: automatic differentiation, computational graph, machine learning, stochastic optimization

BibTeX entry
BibTeX does not have a suitable entry type for preprints. The following is a workaround that produces the correct reference:
@Booklet{EasyChair:1494,
  author = {Zheng Li and Shi Shu},
  title = {SVGD: a Virtual Gradients Descent Method for Stochastic Optimization},
  howpublished = {EasyChair Preprint no. 1494},
  year = {EasyChair, 2019}}