Podcast: Data Science at Home
Episode: Episode 57: Neural networks with infinite layers

Category: Technology
Duration: 00:16:19
Publish Date: 2019-04-23 03:04:27
Description:

How are differential equations related to neural networks? What are the benefits of re-thinking neural networks as differential equation engines? In this episode we explain all of this and provide some material that is worth learning. Enjoy the show!


[Figure: Residual Block]
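The residual block pictured here computes h_{t+1} = h_t + f(h_t), which is exactly one explicit Euler step of the ODE dh/dt = f(h) with step size 1 — the link between ResNets and differential equations that the episode discusses. A minimal NumPy sketch (the layer function `f` below is a hypothetical toy transformation, not a learned one):

```python
import numpy as np

def f(h):
    # Stand-in for a network layer; in a real ResNet this would be a
    # learned transformation (hypothetical fixed tanh map here).
    return 0.1 * np.tanh(h)

def residual_net(h, n_layers):
    # Stacked residual blocks: h_{t+1} = h_t + f(h_t)
    for _ in range(n_layers):
        h = h + f(h)
    return h

def euler_ode(h, t_end, n_steps):
    # Explicit Euler integration of dh/dt = f(h) with step dt.
    # With dt = 1, each step matches one residual block exactly.
    dt = t_end / n_steps
    for _ in range(n_steps):
        h = h + dt * f(h)
    return h

h0 = np.array([1.0, -0.5])
# A 5-block residual net equals 5 Euler steps with dt = 1 (t_end = 5):
print(np.allclose(residual_net(h0, 5), euler_ode(h0, 5.0, 5)))  # True
```

Letting the number of layers grow while the step size shrinks gives, in the limit, a network with "infinitely many layers": the hidden state is defined by the ODE itself and can be computed with any off-the-shelf solver, which is the idea behind Neural ODEs [5].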


References

[1] K. He et al., "Deep Residual Learning for Image Recognition", 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 770-778, 2016.

[2] S. Hochreiter et al., "Long Short-Term Memory", Neural Computation 9(8), pages 1735-1780, 1997.

[3] Q. Liao et al., "Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex", arXiv preprint arXiv:1604.03640, 2016.

[4] Y. Lu et al., "Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations", Proceedings of the 35th International Conference on Machine Learning (ICML), Stockholm, Sweden, 2018.

[5] T. Q. Chen et al., "Neural Ordinary Differential Equations", Advances in Neural Information Processing Systems 31, pages 6571-6583, 2018.



Some more Podcasts by Francesco Gadaleta

300+ Episodes
Data Science .. 10+     1