Podcast: Unsupervised
Episode:

Theory and Practice of Deep Neural Networks, with Daniel Soudry

Category: Technology
Duration: 00:44:03
Publish Date: 2018-12-19 10:25:00
Description:

Daniel Soudry is an assistant professor and a Taub Fellow at the Department of Electrical Engineering at the Technion. His first work focused on neuroscience, attempting to understand how neurons in the brain work. He then continued to a post-doc at Columbia University, where he discovered his interest in both the practical concerns and the theory of deep neural networks. This episode focuses on Daniel's research on questions such as: how can neural networks be trained with low numerical precision, and when are SVMs and logistic regression the same thing? We also talk with him about his path in academia and the journey to discovering his research interests.
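The SVM/logistic-regression connection comes from the implicit-bias result discussed in the episode: on linearly separable data, gradient descent on the logistic loss sends ||w|| to infinity while the direction w/||w|| stabilizes (toward the hard-margin SVM direction). A minimal numerical sketch of that behavior, with toy data and hyperparameters chosen here for illustration (not taken from the paper):

```python
import math

# Hypothetical toy data set: four linearly separable 2-D points.
X = [(1.0, 0.5), (2.0, 2.0), (-1.0, -1.0), (-0.5, -2.0)]
Y = [1.0, 1.0, -1.0, -1.0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

def norm(a):
    return math.hypot(a[0], a[1])

w = [0.0, 0.0]
lr = 0.1
snapshots = {}
for t in range(1, 50001):
    # Gradient of the mean logistic loss: -(1/n) sum_i y_i x_i * sigmoid(-y_i w.x_i)
    g = [0.0, 0.0]
    for x, y in zip(X, Y):
        s = sigmoid(-y * dot(w, x))  # how much this point still pulls on w
        g[0] -= y * s * x[0] / len(X)
        g[1] -= y * s * x[1] / len(X)
    w = [w[0] - lr * g[0], w[1] - lr * g[1]]
    if t in (5000, 50000):
        n = norm(w)
        snapshots[t] = (w[0] / n, w[1] / n)  # record the direction only

# The direction barely moves between iterations 5000 and 50000,
# even though ||w|| keeps growing without bound.
cos = dot(snapshots[5000], snapshots[50000])
separates = all(y * dot(w, x) > 0 for x, y in zip(X, Y))
print(cos, separates)
```

The norm of the final `w` is large while the normalized direction is essentially fixed, which is the sense in which unregularized logistic regression "becomes" a max-margin classifier on separable data.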

Things we discussed in this episode:

  1. D. Soudry, E. Hoffer, M. Shpigel Nacson, S. Gunasekar, N. Srebro, "The Implicit Bias of Gradient Descent on Separable Data", ICLR 2018; accepted to JMLR.

https://en.wikipedia.org/wiki/Logistic_regression

https://en.wikipedia.org/wiki/Support_vector_machine

  2. E. Hoffer, R. Banner, I. Golan, D. Soudry, "Norm matters: efficient and accurate normalization schemes in deep networks", NIPS 2018 (spotlight).

  3. R. Banner, I. Hubara, E. Hoffer, D. Soudry, "Scalable Methods for 8-bit Training of Neural Networks", NIPS 2018.
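As a rough illustration of the numerical constraint that "8-bit training" works under, here is a generic round-to-nearest int8 quantize/dequantize sketch. This is not the scheme from the paper above, just the standard symmetric per-tensor quantization it builds on:

```python
def quantize_int8(values):
    """Map floats to int8 codes using a symmetric per-tensor scale."""
    scale = max(abs(v) for v in values) / 127.0
    codes = [max(-128, min(127, round(v / scale))) for v in values]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate floats from int8 codes."""
    return [c * scale for c in codes]

vals = [0.03, -1.2, 0.75, 0.0001]
codes, scale = quantize_int8(vals)
recon = dequantize(codes, scale)

# Round-to-nearest error is bounded by half a quantization step.
max_err = max(abs(a - b) for a, b in zip(vals, recon))
print(max_err <= scale / 2 + 1e-12)
```

Every tensor value is squeezed into 256 levels, so small values near zero lose most of their relative precision; handling that loss during training (for weights, activations, and gradients) is the practical difficulty the paper addresses.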

Whole-brain imaging of neuronal activity in a larval zebrafish - https://www.youtube.com/watch?v=lppAwkek6DI

Simultaneous Denoising, Deconvolution, and Demixing of Calcium Imaging Data: https://www.cell.com/neuron/fulltext/S0896-6273(15)01084-3
