Podcast: Machine Learning Street Talk
Episode: The Elegant Math Behind Machine Learning - Anil Ananthaswamy

Category: Technology
Duration: 01:53:11
Publish Date: 2024-11-04 21:02:17
Description:

Anil Ananthaswamy is an award-winning science writer and former staff writer and deputy news editor for the London-based New Scientist magazine.


Machine learning systems are making life-altering decisions for us: approving mortgage loans, determining whether a tumor is cancerous, or deciding if someone gets bail. They now influence developments and discoveries in chemistry, biology, and physics—the study of genomes, extrasolar planets, even the intricacies of quantum systems. And all this before large language models such as ChatGPT came on the scene.


We are living through a revolution in machine learning-powered AI that shows no signs of slowing down. This technology is based on relatively simple mathematical ideas, some of which go back centuries, including linear algebra and calculus, the stuff of seventeenth- and eighteenth-century mathematics. It took the birth and advancement of computer science and the kindling of 1990s computer chips designed for video games to ignite the explosion of AI that we see today. In this enlightening book, Anil Ananthaswamy explains the fundamental math behind machine learning, while suggesting intriguing links between artificial and natural intelligence. Might the same math underpin them both?


As Ananthaswamy resonantly concludes, to make safe and effective use of artificial intelligence, we need to understand its profound capabilities and limitations, the clues to which lie in the math that makes machine learning possible.


Why Machines Learn: The Elegant Math Behind Modern AI:

https://amzn.to/3UAWX3D

https://anilananthaswamy.com/


Sponsor message:

Do you want to work on ARC with the MindsAI team (current ARC winners)?

Interested? Apply for an ML research position: benjamin@tufa.ai


Chapters:

00:00:00 Intro

00:02:20 Mathematical Foundations and Future Implications

00:05:14 Background and Journey in ML Mathematics

00:08:27 Historical Mathematical Foundations in ML

00:11:25 Core Mathematical Components of Modern ML

00:14:09 Evolution from Classical ML to Deep Learning

00:21:42 Bias-Variance Trade-off and Double Descent

00:30:39 Self-Supervised vs Supervised Learning Fundamentals

00:32:08 Addressing Spurious Correlations

00:34:25 Language Models and Training Approaches

00:35:48 Future Direction and Unsupervised Learning

00:38:35 Optimization and Dimensionality Challenges

00:43:19 Emergence and Scaling in Large Language Models

01:53:52 Outro
