Modern AI is Domestication
On prior amplification, the art of taming internet-scale data distributions to make models useful and performant for specific tasks.
As internet-scale AI models mature rapidly from coarse research demos into productionized, user-facing systems, expectations have risen and goalposts have moved drastically. In just a few short months, the AI community has collectively shifted from being impressed by proof-of-concept zero-shot capabilities to tackling the challenging last mile: improving the quality and reliability of finetuned capabilities. As much as the community may have wished (or feared) otherwise, it appears that it is not sufficient to simply pour ever-larger amounts of compute, tokens, and parameters into ascending scaling curves. While this naive scaling approach can produce base foundation models with a rough understanding of the sum total of human experience, the trillion-dollar question is how to make these base models useful and performant for specific downstream capabilities. Increasingly, Modern AI is now the study of digital domestication: the art and science of taming wild internet-scale data distributions.
The Goddess Techne is amused?