
Stochastic Gradient Descent in Continuous Time: A Central Limit Theorem. (arXiv:1710.04273v1 [math.PR])


Stochastic gradient descent in continuous time (SGDCT) provides a computationally efficient method for the statistical learning of continuous-time models, which are widely used in science, engineering, and finance. The SGDCT algorithm follows a (noisy) descent direction along a continuous stream of data. The parameter updates occur in continuous time and satisfy a stochastic differential equation. This paper analyzes the asymptotic convergence rate of the SGDCT algorithm by proving a central limit theorem for strongly convex objective functions and, under slightly stronger conditions, for non-convex objective functions as well. An $L^p$ convergence rate is also proven for the algorithm in the strongly convex case.
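As a concrete illustration of the update rule sketched in the abstract, the snippet below runs SGDCT on a simple parameter-estimation problem: recovering the drift parameter of an Ornstein-Uhlenbeck process from a single simulated data stream. This is a minimal sketch, not code from the paper; the model drift g, its gradient dg_dtheta, the learning-rate schedule alpha, and all constants are illustrative assumptions, and the continuous-time dynamics are discretized with an Euler-Maruyama step.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the paper): recover the drift
# parameter theta* of an Ornstein-Uhlenbeck process
#   dX_t = -theta* X_t dt + sigma dW_t
# from a single simulated continuous stream of observations.
theta_true, sigma = 1.0, 2.0
dt, n_steps = 1e-3, 500_000

def g(x, theta):
    # Parametric model drift g(x; theta) = -theta * x.
    return -theta * x

def dg_dtheta(x, theta):
    # Gradient of the model drift with respect to theta.
    return -x

theta, x = 0.0, 1.0  # initial parameter guess and initial state
for n in range(n_steps):
    alpha = 2.0 / (1.0 + n * dt)       # decreasing learning rate alpha_t
    dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over [t, t + dt]
    x_next = x + g(x, theta_true) * dt + sigma * dW  # the observed data stream
    # Euler-Maruyama step of the SGDCT parameter SDE
    #   d theta_t = alpha_t grad_theta g(X_t; theta_t) (dX_t - g(X_t; theta_t) dt),
    # i.e. a noisy descent step driven by the continuous stream of data.
    theta += alpha * dg_dtheta(x, theta) * ((x_next - x) - g(x, theta) * dt)
    x = x_next

print(f"estimated theta = {theta:.3f}, true theta = {theta_true}")

Under the paper's central limit theorem, the rescaled error $\sqrt{t}\,(\theta_t - \theta^*)$ is asymptotically normal; one can check this empirically in the sketch above by repeating the simulation over many independent streams and inspecting the histogram of the rescaled final errors.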

