Connor Leahy has a wide-ranging chat with Jim about the state and future of Deep Learning. They cover the history of EleutherAI, how GPT-3 works, the dynamics and power of scaling laws, ideal sampling rates and sizes for models, datasets, EleutherAI's open-source GPT-Neo and GPT-NeoX, PyTorch vs TensorFlow, TPUs vs GPUs, the challenge of benchmarking and evaluations, quadratic bottlenecks, broad GPT-3 applications, Connor's thoughts on Jim's proposed GPT-3 research project, untapped GPT-3 potential, OpenAI's move away from open source, alignment, AI safety, the unknown future, and much more.
