
Soumith Chintala: PyTorch
On the past and present of machine learning frameworks, and the story of PyTorch and its creator.
In episode 66 of The Gradient Podcast, Daniel Bashir speaks to Soumith Chintala.
Soumith is a Research Engineer at Meta AI Research in NYC. He is the co-creator and lead of PyTorch, and maintains a number of other open-source ML projects, including Torch-7 and EBLearn. Soumith has previously worked on robotics, object and human detection, generative modeling, AI for video games, and ML systems research.
Have suggestions for future podcast guests (or other feedback)? Let us know here!
Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS
Follow The Gradient on Twitter
Outline:
(00:00) Intro
(01:30) Soumith’s intro to AI and journey to PyTorch
(05:00) State of computer vision early in Soumith’s career
(09:15) Institutional inertia and sunk costs in academia, identifying fads
(12:45) How Soumith started working on GANs, frustrations
(17:45) State of ML frameworks early in the deep learning era, differentiators
(23:50) Frameworks and leveling the playing field, exceptions
(25:00) Contributing to Torch and its evolution into PyTorch
(29:15) Soumith’s product vision for ML frameworks
(32:30) From product vision to concrete features in PyTorch
(39:15) Progressive disclosure of complexity (Chollet) in PyTorch
(41:35) Building an open source community
(43:25) The different players in today’s ML framework ecosystem
(49:35) ML frameworks pioneered by Yann LeCun and Léon Bottou, their influences on PyTorch
(54:37) PyTorch 2.0 and looking to the future
(58:00) Soumith’s adventures in household robotics
(1:03:25) Advice for aspiring ML practitioners
(1:07:10) Be cool like Soumith and subscribe :)
(1:07:33) Outro
Links:
Well, GANs were not a fad per se; they found their place as a niche tool for generative tasks. The over-investment in GAN papers was a fad, true.
Fair enough, that seems closer to the truth: definitely an over-investment. It seems like there were some important lessons to learn from working on GANs, but those lessons didn’t require quite as much investment as they got.