
Podcast: Linear Digressions
Episode: Multi-Armed Bandits

Category: Technology
Duration: 00:11:29
Publish Date: 2016-03-06 20:44:17
Description: Multi-armed bandits: how to take your randomized experiment and make it harder better faster stronger. Basically, a multi-armed bandit experiment allows you to optimize for both learning and making use of your knowledge at the same time. It's what the pros (like Google Analytics) use, and it's got a great name, so... winner! Relevant link: https://support.google.com/analytics/answer/2844870?hl=en
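The episode itself doesn't include code, but the learn-while-you-earn idea can be sketched with epsilon-greedy, one simple multi-armed bandit strategy: with a small probability you explore a random variant, and otherwise you exploit the variant with the best observed success rate so far. The arm names, rates, and parameters below are hypothetical, chosen only for illustration.

```python
import random

def epsilon_greedy_bandit(true_rates, n_rounds=10_000, epsilon=0.1, seed=0):
    """Simulate an epsilon-greedy multi-armed bandit.

    true_rates: hypothetical per-arm success probabilities (hidden from
    the agent). With probability epsilon we explore a random arm;
    otherwise we exploit the arm with the best observed success rate.
    """
    rng = random.Random(seed)
    n_arms = len(true_rates)
    pulls = [0] * n_arms  # how many times each arm was tried
    wins = [0] * n_arms   # successes observed per arm
    for _ in range(n_rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore a random arm
        else:
            # Exploit: pick the best empirical rate; untried arms get an
            # optimistic 1.0 so every arm is sampled at least once.
            arm = max(range(n_arms),
                      key=lambda a: wins[a] / pulls[a] if pulls[a] else 1.0)
        pulls[arm] += 1
        wins[arm] += rng.random() < true_rates[arm]
    return pulls, wins

pulls, wins = epsilon_greedy_bandit([0.05, 0.04, 0.12])
# The best arm (index 2) ends up with the bulk of the pulls, while the
# weaker arms still get occasional exploration traffic.
```

Compared with a classic A/B test, which splits traffic evenly until the experiment ends, the bandit shifts traffic toward the winner as evidence accumulates, which is the "optimize for both learning and making use of your knowledge" trade-off the episode describes.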

More podcasts by Ben Jaffe and Katie Malone: 200+ episodes