U.S. Department of Energy
Office of Scientific and Technical Information

Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent

Journal Article · Communications on Applied Mathematics and Computation

Abstract

We prove, under mild conditions, the convergence of a Riemannian gradient descent method for a hyperbolic neural network regression model, in both the batch and stochastic gradient descent settings. We also discuss a Riemannian version of the Adam algorithm. We present numerical simulations of these algorithms on various benchmarks.
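
As a rough illustration of the kind of update analyzed in the abstract, the following is a minimal sketch of a Riemannian SGD step on the Poincaré ball (curvature -1), assuming the standard conformal metric and the Möbius-addition form of the exponential map. The function names (mobius_add, exp_map, rsgd_step) and the toy loss are hypothetical stand-ins for illustration, not the paper's regression model or benchmarks.

import numpy as np

def mobius_add(x, y):
    # Mobius addition on the Poincare ball (curvature -1).
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den

def exp_map(x, v):
    # Exponential map at x applied to tangent vector v.
    v_norm = np.linalg.norm(v)
    if v_norm < 1e-12:
        return x
    lam = 2.0 / (1.0 - np.dot(x, x))            # conformal factor lambda_x
    direction = np.tanh(lam * v_norm / 2.0) * v / v_norm
    return mobius_add(x, direction)

def rsgd_step(x, euclidean_grad, lr=0.1):
    # One Riemannian SGD step: rescale the Euclidean gradient by the
    # inverse metric, then move along the geodesic via the exponential map.
    scale = (1.0 - np.dot(x, x)) ** 2 / 4.0     # 1 / lambda_x^2
    riem_grad = scale * euclidean_grad
    return exp_map(x, -lr * riem_grad)

if __name__ == "__main__":
    # Toy usage: pull a point toward a (hypothetical) target inside the ball.
    rng = np.random.default_rng(0)
    x = 0.1 * rng.standard_normal(2)
    target = np.array([0.3, -0.4])
    for _ in range(200):
        grad = 2.0 * (x - target)               # gradient of a Euclidean toy loss
        x = rsgd_step(x, grad, lr=0.05)
    print(x)                                     # approaches the target

A common variant, not shown here, replaces the exponential map with a cheaper retraction: take a Euclidean step with the rescaled gradient and project the result back into the open unit ball.
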

Sponsoring Organization:
USDOE
Grant/Contract Number:
SC0021142; SC0002722
OSTI ID:
2007675
Journal Information:
Communications on Applied Mathematics and Computation, Vol. 6, Issue 2; ISSN 2096-6385
Publisher:
Springer Science + Business Media
Country of Publication:
China
Language:
English


Similar Records

Stochastic gradient descent for optimization for nuclear systems
Journal Article · May 25, 2023 · Scientific Reports · OSTI ID: 2417878

Correspondence between neuroevolution and gradient descent
Journal Article · November 2, 2021 · Nature Communications · OSTI ID: 1828630

Accelerating gradient descent and Adam via fractional gradients
Journal Article · January 10, 2023 · Neural Networks · OSTI ID: 2282013
