In this Fully-Connected episode, Daniel and Chris ponder whether in-person AI conferences are on the verge of making a post-pandemic comeback. Then they turn to BigScience from Hugging Face, a year-long research workshop on large multilingual models and datasets. Specifically, they dive into T0, a series of natural language processing (NLP) models trained to research zero-shot multitask learning. Daniel gives a brief tour of what's possible with the T0 family. They finish up with a couple of new learning resources.

The podcast and its accompanying cover image on this page belong to Changelog Media. The podcast content is created by Changelog Media and not by, or together with, Poddtoppen.