Podcast: Brain Inspired
Episode: BI 139 Marc Howard: Compressed Time and Memory

Category: Science & Medicine
Duration: 01:20:11
Publish Date: 2022-06-20 16:49:31
Description:

Check out my short video series about what's missing in AI and Neuroscience.

Support the show to get full episodes and join the Discord community.

Marc Howard runs his Theoretical Cognitive Neuroscience Lab at Boston University, where he develops mathematical models of cognition constrained by psychological and neural data. In this episode, we discuss the idea that the Laplace transform and its inverse may serve as a unified framework for memory. In short, our memories are compressed on a continuous log scale: as memories get older, their representations "spread out" in time. This kind of representation appears to be ubiquitous in the brain, suggesting it is a canonical computation the brain uses across a wide variety of cognitive functions. We also discuss some of the ways Marc is incorporating this mathematical operation into deep learning networks to improve their ability to handle information at different time scales.
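The core idea discussed in the episode can be sketched in a few lines of code: a bank of leaky integrators with log-spaced decay rates holds (a discretized approximation of) the Laplace transform of the input's past, and slow-decaying units retain older events while fast-decaying ones forget them, yielding log-scale compression of time. This is a minimal illustrative sketch, not the lab's actual implementation; the function name, parameter values, and rate spacing are assumptions chosen for clarity.

```python
import numpy as np

def laplace_memory(signal, s_values, dt=1.0):
    """Illustrative sketch: each unit F[i] is a leaky integrator,
    dF/dt = -s_i * F + f(t), so the population approximates the
    Laplace transform of the input's history at rates s_values."""
    F = np.zeros(len(s_values))
    history = []
    for f_t in signal:
        F = F + dt * (-s_values * F + f_t)  # Euler step of the leaky integrator
        history.append(F.copy())
    return np.array(history)

# Log-spaced decay rates give the log-scale compression of time
# described in the episode (values here are arbitrary examples).
s = np.logspace(-2, 0, 20)   # rates from 0.01 (slow) to 1.0 (fast)
signal = np.zeros(100)
signal[0] = 1.0              # a single event at t = 0
F_t = laplace_memory(signal, s)

# Fast-decaying units forget the event quickly; slow ones retain it,
# so how long ago the event happened is encoded across the population.
```

In the full model, an approximate inverse Laplace transform applied to this population recovers a fuzzy, log-compressed timeline of the past, with older events represented more coarsely, which is the "spreading out" of older memories mentioned above.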

0:00 - Intro
4:57 - Main idea: Laplace transforms
12:00 - Time cells
20:08 - Laplace, compression, and time cells
25:34 - Everywhere in the brain
29:28 - Episodic memory
35:11 - Randy Gallistel's memory idea
40:37 - Adding Laplace to deep nets
48:04 - Reinforcement learning
1:00:52 - Brad Wyble Q: What gets filtered out?
1:05:38 - Replay and complementary learning systems
1:11:52 - Howard Goldowski Q: Gyorgy Buzsaki
1:15:10 - Obstacles
