
Podcast: Linear Digressions
Episode: KL Divergence

Category: Technology
Duration: 00:25:38
Publish Date: 2017-08-06 22:07:15
Description: Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you try to approximate one distribution with another. It comes to us originally from information theory, but today it underpins other, more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
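
The episode description doesn't give the formula, but the quantity it describes, for a true discrete distribution P approximated by a distribution Q, is D_KL(P || Q) = Σ_x P(x) log(P(x) / Q(x)). Below is a minimal sketch of that computation in Python with NumPy, added here as an illustration rather than material from the episode:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q): the expected extra information
    (in nats) incurred by encoding samples from P with a code built for Q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes that P assigns positive probability;
    # the term 0 * log(0 / q) is taken to be 0 by convention.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: approximating a skewed distribution P with a uniform Q.
p = [0.7, 0.2, 0.1]
q = [1/3, 1/3, 1/3]
print(kl_divergence(p, q))  # positive; it would be 0 only if P == Q
```

Note that KL divergence is not symmetric: D_KL(P || Q) and D_KL(Q || P) generally differ, so which distribution plays the role of the approximation matters.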

Hosts: Ben Jaffe and Katie Malone