The Gradient: Perspectives on AI
Ted Gibson: The Structure and Purpose of Language


On why language looks the way it does, how humans process and use language, and a perspective on LLMs from linguistics.

In episode 107 of The Gradient Podcast, Daniel Bashir speaks to Professor Ted Gibson.

Ted is a Professor of Cognitive Science at MIT. He leads the TedLab, which investigates why languages look the way they do; the relationship between culture and cognition, including language; and how people learn, represent, and process language.

Have suggestions for future podcast guests (or other feedback)? Let us know here or reach us at editor@thegradient.pub

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS
Follow The Gradient on Twitter

Outline:

  • (00:00) Intro

  • (02:13) Prof Gibson’s background

  • (05:33) The computational linguistics community and NLP, engineering focus

  • (10:48) Models of brains

  • (12:03) Prof Gibson’s focus on behavioral work

  • (12:53) How dependency distances impact language processing

    • (14:03) Dependency distances and the origin of the problem

    • (18:53) Dependency locality theory

    • (21:38) The structures languages tend to use

    • (24:58) Sentence parsing: structural integrations and memory costs

    • (36:53) Reading strategies vs. ordinary language processing

    • (40:23) Legalese

    • (46:18) Cross-dependencies

  • (50:11) Number as a cognitive technology

    • (54:48) Experiments

    • (1:03:53) Why counting is useful for Western societies

    • (1:05:53) The Whorf hypothesis

  • (1:13:05) Language as Communication

    • (1:13:28) The noisy channel perspective on language processing

    • (1:27:08) Fedorenko lab experiments—language for thought vs. communication and Chomsky’s claims

    • (1:43:53) Thinking without language, inner voices, language processing vs. language as an aid for other mental processing

  • (1:53:01) Dependency grammars and a critique of Chomsky’s grammar proposals, LLMs

  • (2:08:48) LLM behavior and internal representations

  • (2:12:53) Outro

