Podcast: Data Skeptic
Episode:

[MINI] The Vanishing Gradient

Duration: 00:15:16
Publish Date: 2017-06-30 10:00:00
Description:

This episode discusses the vanishing gradient: a problem that arises when training deep neural networks, in which nearly all of the gradients are very close to zero by the time back-propagation reaches the first hidden layer. This makes learning virtually impossible without some clever trick or improved methodology to help the earlier layers begin to learn.
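The effect described above can be demonstrated numerically. The following is a minimal sketch (not from the episode; layer count, width, and weight scale are illustrative assumptions) of a deep sigmoid network in which an error signal is propagated backwards, shrinking at each layer because the sigmoid's derivative is at most 0.25:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers = 30   # illustrative depth
width = 64      # illustrative layer width

# Random weight matrices, scaled so pre-activations stay roughly unit variance.
weights = [rng.normal(0.0, 1.0, (width, width)) / np.sqrt(width)
           for _ in range(n_layers)]

# Forward pass: record each layer's activations.
activations = [rng.normal(size=width)]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: propagate a unit error signal and record its norm per layer.
# sigmoid'(z) = a * (1 - a), which never exceeds 0.25.
grad = np.ones(width)
norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = W.T @ (grad * a * (1.0 - a))
    norms.append(np.linalg.norm(grad))

print(f"gradient norm at the last hidden layer:  {norms[0]:.3e}")
print(f"gradient norm at the first hidden layer: {norms[-1]:.3e}")
```

Because each backward step multiplies the signal by a sigmoid derivative no larger than 0.25, the gradient norm at the first hidden layer comes out many orders of magnitude smaller than at the last, which is precisely why the earliest layers barely learn.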
