If you read articles about companies like OpenAI and Anthropic training foundation models, it would be natural to assume that without a billion dollars or the resources of a large company, you can't train your own foundation models. But the opposite is true.

In this episode of the Lightcone Podcast, we discuss strategies for building a foundation model from scratch in less than three months, with examples of YC companies doing just that. We also get an exclusive look at OpenAI's Sora!
