Miles Brundage on AI Misuse and Trustworthy AI
An interview with Miles Brundage, Head of Policy Research at OpenAI and a researcher passionate about the responsible governance of artificial intelligence
In episode 17 of The Gradient Podcast, we talk to Miles Brundage, Head of Policy Research at OpenAI and a researcher passionate about the responsible governance of artificial intelligence.
Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS
Economic Possibilities for Our Children: Artificial Intelligence and the Future of Work, Education, and Leisure
The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation
All the News that’s Fit to Fabricate: AI-Generated Text as a Tool of Media Misinformation
Toward Trustworthy AI Development: Mechanisms for Supporting Verifiable Claims
(01:05) How did you get started in AI?
(07:05) Writing about AI on Slate
(09:20) Start of PhD
(13:00) AI and the End of Scarcity
(18:12) Malicious Uses of AI
(28:00) GPT-2 and Publication Norms
(33:30) AI-Generated Text for Misinformation
(37:05) State of AI Misinformation
(41:30) Trustworthy AI
(48:50) OpenAI Policy Research Team
Miles is a researcher and research manager who is passionate about the responsible governance of artificial intelligence. In 2018, he joined OpenAI, where he began as a Research Scientist and recently became Head of Policy Research. Before that, he was a Research Fellow at the University of Oxford's Future of Humanity Institute, where he remains a Research Affiliate. He also serves as a member of Axon's AI and Policing Technology Ethics Board. He completed a PhD in Human and Social Dimensions of Science and Technology at Arizona State University in 2019.
Podcast Theme: “MusicVAE: Trio 16-bar Sample #2” from "MusicVAE: A Hierarchical Latent Vector Model for Learning Long-Term Structure in Music"
Hosted by Andrey Kurenkov (@andrey_kurenkov), a PhD student with the Stanford Vision and Learning Lab working on learning techniques for robotic manipulation and search.