Soumith Chintala: PyTorch
The Gradient: Perspectives on AI - A podcast by Daniel Bashir
In episode 66 of The Gradient Podcast, Daniel Bashir speaks to Soumith Chintala. Soumith is a Research Engineer at Meta AI Research in NYC. He is the co-creator and lead of PyTorch, and maintains a number of other open-source ML projects, including Torch-7 and EBLearn. Soumith has previously worked on robotics, object and human detection, generative modeling, AI for video games, and ML systems research.

Have suggestions for future podcast guests (or other feedback)? Let us know here!

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS

Follow The Gradient on Twitter

Outline:
* (00:00) Intro
* (01:30) Soumith’s intro to AI and journey to PyTorch
* (05:00) The state of computer vision early in Soumith’s career
* (09:15) Institutional inertia and sunk costs in academia, identifying fads
* (12:45) How Soumith started working on GANs, frustrations
* (17:45) The state of ML frameworks early in the deep learning era, differentiators
* (23:50) Frameworks and leveling the playing field, exceptions
* (25:00) Contributing to Torch and its evolution into PyTorch
* (29:15) Soumith’s product vision for ML frameworks
* (32:30) From product vision to concrete features in PyTorch
* (39:15) Progressive disclosure of complexity (Chollet) in PyTorch
* (41:35) Building an open-source community
* (43:25) The different players in today’s ML framework ecosystem
* (49:35) ML frameworks pioneered by Yann LeCun and Léon Bottou, and their influences on PyTorch
* (54:37) PyTorch 2.0 and looking to the future
* (58:00) Soumith’s adventures in household robotics
* (1:03:25) Advice for aspiring ML practitioners
* (1:07:10) Be cool like Soumith and subscribe :)
* (1:07:33) Outro

Links:
* Soumith’s Twitter and homepage
* Papers:
  * Convolutional Neural Networks Applied to House Numbers Digit Classification
  * GANs: LAPGAN, DCGAN, Wasserstein GAN
  * Automatic differentiation in PyTorch
  * PyTorch: An Imperative Style, High-Performance Deep Learning Library

Get full access to The Gradient at thegradientpub.substack.com/subscribe