Podcast: Data Science at Home
Episode: Make Stochastic Gradient Descent Fast Again (Ep. 113)

Category: Technology
Duration: 00:20:35
Publish Date: 2020-07-22 03:53:18
Description:

There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
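For context on what the episode is improving upon, here is a minimal sketch of the standard Adam optimizer on a toy quadratic loss. This is a hypothetical illustration in NumPy, not the method discussed in the episode (which the description does not name); the learning rate and toy loss are assumptions for demonstration.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update (standard formulation; hyperparameters are the usual defaults)."""
    m = b1 * m + (1 - b1) * grad             # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad**2          # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1**t)                  # bias correction for the warm-up phase
    v_hat = v / (1 - b2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy quadratic loss f(w) = ||w||^2 / 2, whose gradient is simply w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 501):
    grad = w                                 # gradient of the toy loss at w
    w, m, v = adam_step(w, grad, m, v, t)

print(np.linalg.norm(w))                     # parameter norm shrinks toward the minimum at 0
```

Because Adam rescales each coordinate by its running gradient variance, early updates behave almost like fixed-size signed steps, which is part of why it converges quickly but, as the episode warns for related tweaks, can generalize differently from plain SGD.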

Join our Discord channel and chat with us.



Podcast by Francesco Gadaleta.