Today we’re speaking with three researchers: Karan Grewal, Abhi Iyer and Akash Velu, about multi-task learning and how their new brain-inspired approach can help tackle it.

We’ll be discussing:

- what a task is, and what exactly we mean by multi-task systems
- distances between tasks
- the difference between continual learning and multi-task learning
- catastrophic forgetting, catastrophic interference, and their causes
- existing approaches such as context-dependent gating and synaptic intelligence
- the roles of scale and sparsity
- some basics of the brain: dendrites, proximal and distal dendrites, apical and basal dendrites
- how active dendrites can help solve the challenge of gradient interference in multi-task learning
- how they attach dendrites to each neuron in their deep learning models
- their various approaches to representing context vectors
- the challenges of brain-inspired approaches to machine learning
- and some speculation about the future of this line of research
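For listeners who want a feel for the core idea before pressing play: the paper describes neurons augmented with dendritic segments that read a task context vector, and the strongest-matching segment gates the neuron's feedforward activation up or down. Below is a minimal NumPy sketch of that modulation step. It is our illustration under simplifying assumptions (single example, no k-winners-take-all, made-up shapes), not the authors' code.

```python
import numpy as np

def active_dendrite_layer(x, context, W, b, segments):
    """Sketch: units modulated by dendritic segments matching a context vector.

    x        : feedforward input, shape (d_in,)
    context  : context vector encoding the current task, shape (d_ctx,)
    W, b     : ordinary feedforward weights/bias, shapes (d_out, d_in), (d_out,)
    segments : dendritic segment weights, shape (d_out, n_segments, d_ctx)
    """
    feedforward = W @ x + b                   # usual linear response
    seg_activations = segments @ context      # (d_out, n_segments): segment/context match
    strongest = seg_activations.max(axis=1)   # best-matching segment per unit
    gate = 1.0 / (1.0 + np.exp(-strongest))   # sigmoid modulation in (0, 1)
    return feedforward * gate                 # dendrites up/down-weight each unit

# Hypothetical shapes, just to exercise the sketch
rng = np.random.default_rng(0)
d_in, d_out, d_ctx, n_seg = 8, 4, 5, 3
x = rng.normal(size=d_in)
context = rng.normal(size=d_ctx)
W = rng.normal(size=(d_out, d_in))
b = np.zeros(d_out)
segments = rng.normal(size=(d_out, n_seg, d_ctx))
y = active_dendrite_layer(x, context, W, b, segments)
print(y.shape)
```

Because different tasks produce different context vectors, different subsets of units end up strongly gated per task, which is one way to reduce the gradient interference discussed in the episode.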

Some references:
Their paper can be found here: https://arxiv.org/abs/2201.00042

All three of them had a great appearance on the Yannic Kilcher YouTube channel here: https://youtu.be/smxwT82o40Y

They mentioned Hierarchical Temporal Memory as the foundation stone for a lot of their research: https://numenta.com/blog/2019/10/24/machine-learning-guide-to-htm

The podcast and the accompanying cover image on this page belong to AutoML Media. The content of the podcast is created by AutoML Media, and not by, or together with, Poddtoppen.


The AutoML Podcast

Active Dendrites: Brain-inspired multi-task learning
