
Podcast: Data Science at Home
Episode: Activate deep learning neurons faster with Dynamic ReLU (ep. 101)

Category: Technology
Duration: 00:22:18
Publish Date: 2020-04-01 07:13:59
Description:

In this episode, I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU). While there are several flavors of ReLU in the literature, I focus on a very interesting approach that keeps computational complexity low while improving performance quite consistently.
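If you want to see the idea in code, here is a minimal sketch, assuming PyTorch and the DY-ReLU-A variant from the paper referenced below: the activation computes y = max_k(a_k(x) * x + b_k(x)), where the slopes a_k and intercepts b_k come from a small hyper-network conditioned on the pooled input. The class name, layer widths, and the reduction parameter are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class DyReLUA(nn.Module):
    """Sketch of DY-ReLU-A: y = max_k(a_k(x) * x + b_k(x)).

    The 2k coefficients (k slopes a_k and k intercepts b_k) are produced
    by a small hyper-network conditioned on the input itself (global
    average pooling followed by two fully connected layers). In variant A
    the coefficients are shared across all channels, which keeps the
    extra computation negligible.
    """

    def __init__(self, channels, k=2, reduction=4):
        super().__init__()
        self.k = k
        # Hyper-network: pooled features -> 2k coefficients in [0, 1]
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k),
            nn.Sigmoid(),
        )
        # Residual parameterization: with theta = 0 this reduces to
        # a_1 = 1 and all other coefficients 0, i.e. plain ReLU.
        self.register_buffer("lambdas", torch.tensor([1.0] * k + [0.5] * k))
        self.register_buffer("init", torch.tensor([1.0] + [0.0] * (2 * k - 1)))

    def forward(self, x):                       # x: (N, C, H, W)
        theta = x.mean(dim=(2, 3))              # global average pooling -> (N, C)
        theta = 2 * self.fc(theta) - 1          # rescale sigmoid output to [-1, 1]
        coeffs = self.init + self.lambdas * theta          # (N, 2k)
        a = coeffs[:, : self.k].view(-1, self.k, 1, 1, 1)  # slopes
        b = coeffs[:, self.k :].view(-1, self.k, 1, 1, 1)  # intercepts
        # Piecewise-linear activation: elementwise max over the k branches.
        return (a * x.unsqueeze(1) + b).max(dim=1).values

# Drop-in replacement for nn.ReLU after a convolutional layer:
act = DyReLUA(channels=16)
y = act(torch.randn(2, 16, 8, 8))
print(y.shape)  # torch.Size([2, 16, 8, 8])
```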

This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website.

Don't forget to join us on the Discord channel to propose new episodes or discuss previous ones.

References

Dynamic ReLU (Chen et al., 2020): https://arxiv.org/abs/2003.10027
