Podcast: Data Skeptic
Episode: [MINI] Backpropagation

Category: Religion & Spirituality
Duration: 00:15:13
Publish Date: 2017-04-07 10:00:00
Description:

Backpropagation is a common algorithm for training a neural network. It works by computing the gradient of the overall error with respect to each weight, then using stochastic gradient descent to iteratively fine-tune the weights of the network. In this episode, we compare this concept to finding a location on a map, marble maze games, and golf.
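Below is a minimal sketch of the idea described above (not code from the episode): a one-hidden-layer network on a toy XOR problem, where backpropagation applies the chain rule to obtain the gradient of the error with respect to each weight, and stochastic gradient descent nudges each weight a small step against that gradient. The network size, learning rate, and data are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn XOR with a 2-4-1 network (sizes chosen for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate: how far each SGD step moves the weights

for step in range(20000):
    i = rng.integers(len(X))            # "stochastic": one random example per step
    x, t = X[i:i + 1], y[i:i + 1]

    # Forward pass.
    h = sigmoid(x @ W1 + b1)            # hidden activations
    p = sigmoid(h @ W2 + b2)            # network output

    # Backward pass (backpropagation): the chain rule gives the gradient of the
    # squared error E = 0.5 * (p - t)^2 with respect to every weight.
    d2 = (p - t) * p * (1 - p)          # error signal at the output layer
    d1 = (d2 @ W2.T) * h * (1 - h)      # error signal pushed back to the hidden layer

    # SGD update: move each weight a small step opposite to its gradient.
    W2 -= lr * h.T @ d2; b2 -= lr * d2
    W1 -= lr * x.T @ d1; b1 -= lr * d1

print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))   # should approach [0, 1, 1, 0]

The episode's map, marble maze, and golf analogies all describe the same loop: the gradient says which direction is "downhill" in error, and each update takes a small step that way.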
