Foundational Models are the Future but... with Alex Ratner CEO of Snorkel AI // MLOps Podcast #139
MLOps Coffee Sessions #139 with Alex Ratner, "Putting Foundation Models to Use for the Enterprise," co-hosted by Abi Aryan and sponsored by Snorkel AI.
Foundation models are rightfully being compared to other game-changing industrial advances like steam engines or electric motors. They’re core to the transition of AI from a bespoke, less predictable science to an industrialized, democratized practice. Before they can achieve this impact, however, we need to bridge the cost, quality, and control gaps.
The Snorkel Flow Foundation Model Suite is the fastest way for AI/ML teams to put foundation models to use. For some projects, this means fine-tuning a foundation model for production dramatically faster by programmatically creating labeled training data. For others, the optimal solution will be using Snorkel Flow’s distill, combine, and correct approach to extract the most relevant knowledge from foundation models and encode that value into right-sized models for your use case.
AI/ML teams can determine which Foundation Model Suite capabilities to use (and in what combination) to optimize for cost, quality, and control using Snorkel Flow’s integrated workflow for programmatic labeling, model training, and rapid, guided iteration.
Alex Ratner is the Co-founder and CEO of Snorkel AI and an Assistant Professor of Computer Science at the University of Washington.
Prior to Snorkel AI and UW, he completed his Ph.D. in Computer Science at Stanford, advised by Christopher Ré, where he started and led the Snorkel open-source project. His research focused on applying data management and statistical learning techniques to emerging machine learning workflows, such as creating and managing training data, and on applying this work to real-world problems in medicine, knowledge base construction, and more. Previously, he earned his A.B. in Physics from Harvard University.
// MLOps Jobs board
// MLOps Swag/Merch
// Related Links
Huge “foundation models” are turbo-charging AI progress: https://www.economist.com/interactive/briefing/2022/06/11/huge-foundation-models-are-turbo-charging-ai-progress
Nemo: Guiding and Contextualizing Weak Supervision for Interactive Data Programming: https://arxiv.org/abs/2203.01382
The Principles of Data-Centric AI Development: https://snorkel.ai/principles-of-data-centric-ai-development/
--------------- ✌️Connect With Us ✌️ -------------
Join our slack community: https://go.mlops.community/slack
Follow us on Twitter: @mlopscommunity
Sign up for the next meetup: https://go.mlops.community/register
Catch all episodes, blogs, newsletters, and more: https://mlops.community/
Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
Connect with Abi on LinkedIn: https://www.linkedin.com/in/abiaryan/
Connect with Alex on LinkedIn: https://www.linkedin.com/in/alexander-ratner-038ba239/
Timestamps:
[00:00] Alex's preferred coffee
[01:20] Introduction to Alex Ratner
[02:34] Takeaways
[04:04] Huge shoutout to our sponsor, Snorkel AI!
[04:39] Comment, rate us, and share our podcasts with your friends!
[04:50] Transfer learning / active learning
[11:30] Labeling heuristics paper on Nemo
[18:14] Data-centric AI
[21:48] Enterprise use cases for foundation models
[32:45] Foundation models in different Google products
[38:36] Progress in foundation models
[43:55] AutoML models' baseline accuracy
[44:40] Hosting infrastructure: Snorkel Flow vs. GCP
[46:53] Chris Ré's venture capital firm / incubator
[51:00] Wrap