Player FM - Internet Radio Done Right
Added four years ago
Content provided by The Thesis Review and Sean Welleck. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Thesis Review and Sean Welleck or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://tr.player.fm/legal
[13] Adji Bousso Dieng - Deep Probabilistic Graphical Modeling
Adji Bousso Dieng is currently a Research Scientist at Google AI, and will be starting as an assistant professor at Princeton University in 2021. Her research focuses on combining probabilistic graphical modeling and deep learning to design models for structured high-dimensional data. Her PhD thesis is titled "Deep Probabilistic Graphical Modeling", which she completed in 2020 at Columbia University. We discuss her work on combining graphical models and deep learning, including models and algorithms, the value of interpretability and probabilistic models, as well as applications and making an impact through research. Episode notes: https://cs.nyu.edu/~welleck/episode13.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview
49 episodes
All episodes
[48] Tianqi Chen - Scalable and Intelligent Learning Systems 46:29
Tianqi Chen is an Assistant Professor in the Machine Learning Department and Computer Science Department at Carnegie Mellon University and the Chief Technologist of OctoML. His research focuses on the intersection of machine learning and systems. Tianqi's PhD thesis is titled "Scalable and Intelligent Learning Systems," which he completed in 2019 at the University of Washington. We discuss his influential work on machine learning systems, starting with the development of XGBoost, an optimized distributed gradient boosting library that has had an enormous impact in the field. We also cover his contributions to deep learning frameworks like MXNet and machine learning compilation with TVM, and connect these to modern generative AI. - Episode notes: www.wellecks.com/thesisreview/episode48.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Follow Tianqi Chen on Twitter (@tqchenml) - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[47] Niloofar Mireshghallah - Auditing and Mitigating Safety Risks in Large Language Models 1:17:06
Niloofar Mireshghallah is a postdoctoral scholar at the University of Washington. Her research focuses on privacy, natural language processing, and the societal implications of machine learning. Niloofar completed her PhD in 2023 at UC San Diego, where she was advised by Taylor Berg-Kirkpatrick. Her PhD thesis is titled "Auditing and Mitigating Safety Risks in Large Language Models." We discuss her journey into research and her work on privacy and LLMs, including how privacy is defined, common attacks and mitigations, differential privacy, and the balance between memorization and generalization. - Episode notes: www.wellecks.com/thesisreview/episode47.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[46] Yulia Tsvetkov - Linguistic Knowledge in Data-Driven NLP 59:53
Yulia Tsvetkov is a Professor in the Allen School of Computer Science & Engineering at the University of Washington. Her research focuses on multilingual NLP, NLP for social good, and language generation. Yulia's PhD thesis is titled "Linguistic Knowledge in Data-Driven Natural Language Processing", which she completed in 2016 at CMU. We discuss getting started in research, then move to Yulia's work in the thesis that combines ideas from linguistics and natural language processing. We discuss low-resource and multilingual NLP, large language models, and great advice about research and beyond. - Episode notes: www.wellecks.com/thesisreview/episode46.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[45] Luke Zettlemoyer - Learning to Map Sentences to Logical Form 59:35
Luke Zettlemoyer is a Professor at the University of Washington and Research Scientist at Meta. His work spans machine learning and NLP, including foundational work in large-scale self-supervised pretraining of language models. Luke's PhD thesis is titled "Learning to Map Sentences to Logical Form", which he completed in 2009 at MIT. We talk about his PhD work, the path to the foundational ELMo paper, and various topics related to large language models. - Episode notes: www.wellecks.com/thesisreview/episode45.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[44] Hady Elsahar - NLG from Structured Knowledge Bases (& Controlling LMs) 1:05:56
Hady Elsahar is a Research Scientist at Naver Labs Europe. His research focuses on neural language generation under constrained and controlled conditions. Hady's PhD was on interactions between natural language and structured knowledge bases for Data2Text generation and relation extraction & discovery, which he completed in 2019 at the Université de Lyon. We talk about his PhD work and how it led to interests in multilingual and low-resource NLP, as well as controlled generation. We dive deeper into controlling language models, including his interesting work on distributional control and energy-based models. - Episode notes: www.wellecks.com/thesisreview/episode44.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[43] Swarat Chaudhuri - Logics and Algorithms for Software Model Checking 1:06:18
Swarat Chaudhuri is an Associate Professor at the University of Texas. His lab studies problems at the interface of programming languages, logic and formal methods, and machine learning. Swarat's PhD thesis is titled "Logics and Algorithms for Software Model Checking", which he completed in 2007 at the University of Pennsylvania. We discuss reasoning about programs, formal methods & safer machine learning systems, and the future of program synthesis & neurosymbolic programming. - Episode notes: www.wellecks.com/thesisreview/episode43.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[42] Charles Sutton - Efficient Training Methods for Conditional Random Fields 1:18:01
Charles Sutton is a Research Scientist at Google Brain and an Associate Professor at the University of Edinburgh. His research focuses on deep learning for generating code and helping people write better programs. Charles' PhD thesis is titled "Efficient Training Methods for Conditional Random Fields", which he completed in 2008 at UMass Amherst. We start with his work in the thesis on structured models for text, and compare/contrast with today's large language models. From there, we discuss machine learning for code & the future of language models in program synthesis. - Episode notes: https://cs.nyu.edu/~welleck/episode42.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[41] Talia Ringer - Proof Repair 1:19:02
Talia Ringer is an Assistant Professor with the Programming Languages, Formal Methods, and Software Engineering group at the University of Illinois Urbana-Champaign. Her research focuses on formal verification and proof engineering technologies. Talia's PhD thesis is titled "Proof Repair", which she completed in 2021 at the University of Washington. We discuss software verification, her PhD work on proof repair for maintaining verified systems, and the intersection of machine learning with her work. - Episode notes: https://cs.nyu.edu/~welleck/episode41.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[40] Lisa Lee - Learning Embodied Agents with Scalably-Supervised RL 46:59
Lisa Lee is a Research Scientist at Google Brain. Her research focuses on building AI agents that can learn and adapt like humans and animals do. Lisa's PhD thesis is titled "Learning Embodied Agents with Scalably-Supervised Reinforcement Learning", which she completed in 2021 at Carnegie Mellon University. We talk about her work in the thesis on reinforcement learning, including exploration, learning with weak supervision, and embodied agents, and cover various topics related to trends in reinforcement learning. - Episode notes: https://cs.nyu.edu/~welleck/episode40.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[39] Burr Settles - Curious Machines: Active Learning with Structured Instances 1:06:33
Burr Settles leads the research group at Duolingo, a language-learning website and mobile app whose mission is to make language education free and accessible to everyone. Burr's PhD thesis is titled "Curious Machines: Active Learning with Structured Instances", which he completed in 2008 at the University of Wisconsin-Madison. We talk about his work in the thesis on active learning, then chart the path to Burr's role at Duolingo. We discuss machine learning for education and language learning, including content, assessment, and the exciting possibilities opened by recent advancements. - Episode notes: https://cs.nyu.edu/~welleck/episode39.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[38] Andrew Lampinen - A Computational Framework for Learning and Transforming Task Representations 1:04:47
Andrew Lampinen is a research scientist at DeepMind. His research focuses on cognitive flexibility and generalization. Andrew’s PhD thesis is titled "A Computational Framework for Learning and Transforming Task Representations", which he completed in 2020 at Stanford University. We talk about cognitive flexibility in brains and machines, centered around his work in the thesis on meta-mapping. We cover a lot of interesting ground, including complementary learning systems and memory, compositionality and systematicity, and the role of symbols in machine learning. - Episode notes: https://cs.nyu.edu/~welleck/episode38.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[37] Joonkoo Park - Neural Substrates of Visual Word and Number Processing 1:09:28
Joonkoo Park is an Associate Professor and Honors Faculty in the Department of Psychological and Brain Sciences at UMass Amherst. He leads the Cognitive and Developmental Neuroscience Lab, focusing on understanding the developmental mechanisms and neurocognitive underpinnings of our knowledge about number and mathematics. Joonkoo’s PhD thesis is titled "Experiential Effects on the Neural Substrates of Visual Word and Number Processing", which he completed in 2011 at the University of Michigan. We talk about numerical processing in the brain, starting with nature vs. nurture, including the learned versus built-in aspects of neural architectures. We talk about the difference between word and number processing, types of numerical thinking, and symbolic vs. non-symbolic numerical processing. - Episode notes: https://cs.nyu.edu/~welleck/episode37.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[36] Dieuwke Hupkes - Hierarchy and Interpretability in Neural Models of Language Processing 1:02:26
Dieuwke Hupkes is a Research Scientist at Facebook AI Research and the scientific manager of the Amsterdam unit of ELLIS. Dieuwke's PhD thesis is titled, "Hierarchy and Interpretability in Neural Models of Language Processing", which she completed in 2020 at the University of Amsterdam. We discuss her work on which aspects of hierarchical compositionality and syntactic structure can be learned by recurrent neural networks, how these models can serve as explanatory models of human language processing, what compositionality actually means, and a lot more. - Episode notes: https://cs.nyu.edu/~welleck/episode36.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[35] Armando Solar-Lezama - Program Synthesis by Sketching 1:15:56
Armando Solar-Lezama is a Professor at MIT, and the Associate Director & COO of CSAIL. He leads the Computer Assisted Programming Group, focused on program synthesis. Armando’s PhD thesis is titled, "Program Synthesis by Sketching", which he completed in 2008 at UC Berkeley. We talk about program synthesis & his work on Sketch, how machine learning's role in program synthesis has evolved over time, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode35.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[34] Sasha Rush - Lagrangian Relaxation for Natural Language Decoding 1:08:12
Sasha Rush is an Associate Professor at Cornell Tech and researcher at Hugging Face. His research focuses on building NLP systems that are safe, fast, and controllable. Sasha's PhD thesis is titled, "Lagrangian Relaxation for Natural Language Decoding", which he completed in 2014 at MIT. We talk about his work in the thesis on decoding in NLP, how it connects with today, and many interesting topics along the way such as the role of engineering in machine learning, breadth vs. depth, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode34.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[33] Michael R. Douglas - G/H Conformal Field Theory 1:12:58
Michael R. Douglas is a theoretical physicist and Professor at Stony Brook University, and Visiting Scholar at Harvard University. His research focuses on string theory, theoretical physics and its relations to mathematics. Michael's PhD thesis is titled, "G/H Conformal Field Theory", which he completed in 1988 at Caltech. We talk about working with Feynman, Sussman, and Hopfield during his PhD days, the superstring revolutions and string theory, and machine learning's role in the future of science and mathematics. - Episode notes: https://cs.nyu.edu/~welleck/episode33.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[32] Andre Martins - The Geometry of Constrained Structured Prediction 1:26:39
Andre Martins is an Associate Professor at IST and VP of AI Research at Unbabel in Lisbon, Portugal. His research focuses on natural language processing and machine learning. Andre's PhD thesis is titled "The Geometry of Constrained Structured Prediction: Applications to Inference and Learning of Natural Language Syntax", which he completed in 2012 at Carnegie Mellon University and IST. We talk about his work in the thesis on structured prediction in NLP, and discuss connections between his thesis work and later work on sparsity, sparse communication, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode32.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[31] Jay McClelland - Preliminary Letter Identification in the Perception of Words and Nonwords 1:33:51
Jay McClelland is a Professor in the Psychology Department and Director of the Center for Mind, Brain, Computation and Technology at Stanford. His research addresses a broad range of topics in cognitive science and cognitive neuroscience, including Parallel Distributed Processing (PDP). Jay's PhD thesis is titled "Preliminary Letter Identification in the Perception of Words and Nonwords", which he completed in 1975 at the University of Pennsylvania. We discuss his work in the thesis on the word superiority effect, how it led to the Interactive Activation model, the path to Parallel Distributed Processing and the connectionist revolution, and distributed vs. rule-based and symbolic approaches to modeling human cognition and artificial intelligence. - Episode notes: https://cs.nyu.edu/~welleck/episode31.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[30] Dustin Tran - Probabilistic Programming for Deep Learning 1:02:50
Dustin Tran is a research scientist at Google Brain. His research focuses on advancing science and intelligence, including areas involving probability, programs, and neural networks. Dustin’s PhD thesis is titled "Probabilistic Programming for Deep Learning", which he completed in 2020 at Columbia University. We discuss the intersection of probabilistic modeling and deep learning, including the Edward library and the novel inference algorithms and models that he developed in the thesis. - Episode notes: https://cs.nyu.edu/~welleck/episode30.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[29] Tengyu Ma - Non-convex Optimization for Machine Learning 1:17:22
Tengyu Ma is an Assistant Professor at Stanford University. His research focuses on deep learning and its theory, as well as various topics in machine learning. Tengyu's PhD thesis is titled "Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding", which he completed in 2017 at Princeton University. We discuss theory in machine learning and deep learning, including the 'all local minima are global minima' property, overparameterization, as well as perspectives that theory takes on understanding deep learning. - Episode notes: https://cs.nyu.edu/~welleck/episode29.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[28] Karen Ullrich - A Coding Perspective on Deep Latent Variable Models 1:06:20
Karen Ullrich is a Research Scientist at FAIR. Her research focuses on the intersection of information theory and probabilistic machine learning and deep learning. Karen's PhD thesis is titled "A coding perspective on deep latent variable models", which she completed in 2020 at The University of Amsterdam. We discuss information theory & the minimum description length principle, along with her work in the thesis on compression and communication. Episode notes: https://cs.nyu.edu/~welleck/episode28.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[27] Danqi Chen - Neural Reading Comprehension and Beyond 55:43
Danqi Chen is an assistant professor at Princeton University, co-leading the Princeton NLP Group. Her research focuses on fundamental methods for learning representations of language and knowledge, and practical systems including question answering, information extraction and conversational agents. Danqi’s PhD thesis is titled "Neural Reading Comprehension and Beyond", which she completed in 2018 at Stanford University. We discuss her work on parsing, reading comprehension and question answering. Throughout we discuss progress in NLP, fundamental challenges, and what the future holds. Episode notes: https://cs.nyu.edu/~welleck/episode27.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[26] Kevin Ellis - Algorithms for Learning to Induce Programs 1:17:49
Kevin Ellis is an assistant professor at Cornell and currently a research scientist at Common Sense Machines. His research focuses on artificial intelligence, program synthesis, and neurosymbolic models. Kevin's PhD thesis is titled "Algorithms for Learning to Induce Programs", which he completed in 2020 at MIT. We discuss Kevin’s work at the intersection of machine learning and program induction, including inferring graphics programs from images and drawings, DreamCoder, and more. Episode notes: https://cs.nyu.edu/~welleck/episode26.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[25] Tomas Mikolov - Statistical Language Models Based on Neural Networks 1:19:17
Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, and complexity. Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata. Episode notes: https://cs.nyu.edu/~welleck/episode25.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[24] Martin Arjovsky - Out of Distribution Generalization in Machine Learning 1:02:48
Martin Arjovsky is a postdoctoral researcher at INRIA. His research focuses on generative modeling, generalization, and exploration in RL. Martin's PhD thesis is titled "Out of Distribution Generalization in Machine Learning", which he completed in 2019 at New York University. We discuss his work on the influential Wasserstein GAN early in his PhD, then discuss his thesis work on out-of-distribution generalization which focused on causal invariance and invariant risk minimization. Episode notes: https://cs.nyu.edu/~welleck/episode24.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[23] Simon Du - Gradient Descent for Non-convex Problems in Modern Machine Learning 1:06:30
Simon Shaolei Du is an Assistant Professor at the University of Washington. His research focuses on theoretical foundations of deep learning, representation learning, and reinforcement learning. Simon's PhD thesis is titled "Gradient Descent for Non-convex Problems in Modern Machine Learning", which he completed in 2019 at Carnegie Mellon University. We discuss his work related to the theory of gradient descent for challenging non-convex problems that we encounter in deep learning. We cover various topics including connections with the Neural Tangent Kernel, theory vs. practice, and future research directions. Episode notes: https://cs.nyu.edu/~welleck/episode23.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[22] Graham Neubig - Unsupervised Learning of Lexical Information 1:02:30
Graham Neubig is an Associate Professor at Carnegie Mellon University. His research focuses on language and its role in human communication, with the goal of breaking down barriers in human-human or human-machine communication through the development of NLP technologies. Graham’s PhD thesis is titled "Unsupervised Learning of Lexical Information for Language Processing Systems", which he completed in 2012 at Kyoto University. We discuss his PhD work related to the fundamental processing units that NLP systems use to process text, including non-parametric Bayesian models, segmentation, and alignment problems, and discuss how his perspective on machine translation has evolved over time. Episode notes: http://cs.nyu.edu/~welleck/episode22.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at http://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[21] Michela Paganini - Machine Learning Solutions for High Energy Physics 1:07:43
Michela Paganini is a Research Scientist at DeepMind. Her research focuses on investigating ways to compress and scale up neural networks. Michela's PhD thesis is titled "Machine Learning Solutions for High Energy Physics", which she completed in 2019 at Yale University. We discuss her PhD work on deep learning for high energy physics, including jet tagging and fast simulation for the ATLAS experiment at the Large Hadron Collider, and the intersection of machine learning and physics. Episode notes: https://cs.nyu.edu/~welleck/episode21.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[20] Josef Urban - Deductive and Inductive Reasoning in Large Libraries of Formalized Mathematics 1:25:18
Josef Urban is a Principal Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research focuses on artificial intelligence for large-scale computer-assisted reasoning. Josef's PhD thesis is titled "Exploring and Combining Deductive and Inductive Reasoning in Large Libraries of Formalized Mathematics", which he completed in 2004 at Charles University in Prague. We discuss his PhD work on the Mizar Problems for Theorem Proving, machine learning for premise selection, and how it evolved into his recent research. Episode notes: https://cs.nyu.edu/~welleck/episode20.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[19] Dumitru Erhan - Understanding Deep Architectures and the Effect of Unsupervised Pretraining 1:20:03
Dumitru Erhan is a Research Scientist at Google Brain. His research focuses on understanding the world with neural networks. Dumitru's PhD thesis is titled "Understanding Deep Architectures and the Effect of Unsupervised Pretraining", which he completed in 2010 at the University of Montreal. We discuss his work in the thesis on understanding deep networks and unsupervised pretraining, his perspective on deep learning's development, and the path of ideas to his recent research. Episode notes: https://cs.nyu.edu/~welleck/episode19.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[18] Eero Simoncelli - Distributed Representation and Analysis of Visual Motion 1:25:37
Eero Simoncelli is a Professor of Neural Science, Mathematics, Data Science, and Psychology at New York University. His research focuses on representation and analysis of visual information. Eero's PhD thesis is titled "Distributed Representation & Analysis of Visual Motion", which he completed in 1993 at MIT. We discuss his PhD work on optical flow, the ideas and methods that have stayed with him throughout his career, making biological connections with machine learning models, and how Eero's perspective on vision has evolved. Episode notes: https://cs.nyu.edu/~welleck/episode18.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[17] Paul Middlebrooks - Neuronal Correlates of Meta-Cognition in Primate Frontal Cortex 1:36:10
Paul Middlebrooks is a neuroscientist and host of the Brain Inspired podcast, which explores the intersection of neuroscience and artificial intelligence. Paul's PhD thesis is titled "Neuronal Correlates of Meta-Cognition in Primate Frontal Cortex", which he completed at the University of Pittsburgh in 2011. We discuss Paul's work on meta-cognition - informally, thinking about thinking - then discuss neuroscience for A.I. and A.I. for neuroscience. Episode notes: https://cs.nyu.edu/~welleck/episode17.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.patreon.com/thesisreview…

[16] Aaron Courville - A Latent Cause Theory of Classical Conditioning 1:19:21
Aaron Courville is a Professor at the University of Montreal. His research focuses on the development of deep learning models and methods. Aaron's PhD thesis is titled "A Latent Cause Theory of Classical Conditioning", which he completed at Carnegie Mellon University in 2006. We discuss Aaron's work on the latent cause theory during his PhD, talk about how Aaron moved into machine learning and deep learning research, chart a path to today's deep learning methods, and discuss his recent work on systematic generalization in language. Episode notes: cs.nyu.edu/~welleck/episode16.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[15] Christian Szegedy - Some Applications of the Weighted Combinatorial Laplacian 1:06:52
Christian Szegedy is a Research Scientist at Google. His research has produced machine learning methods such as the Inception architecture, batch normalization, and adversarial examples, and he currently investigates machine learning for mathematical reasoning. Christian's PhD thesis is titled "Some Applications of the Weighted Combinatorial Laplacian", which he completed in 2005 at the University of Bonn. We discuss Christian's background in mathematics, his PhD work spanning both pure and applied mathematics, and his path into machine learning research. Finally, we discuss his recent work on using deep learning for mathematical reasoning and automatically formalizing mathematics. Episode notes: https://cs.nyu.edu/~welleck/episode15.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[14] Been Kim - Interactive and Interpretable Machine Learning Models 1:04:22
Been Kim is a Research Scientist at Google Brain. Her research focuses on designing high-performance machine learning methods that make sense to humans. Been's PhD thesis is titled "Interactive and Interpretable Machine Learning Models for Human Machine Collaboration", which she completed in 2015 at MIT. We discuss her work on interpretability, including her work in the thesis on the Bayesian Case Model and its interactive version, as well as connections with her subsequent work on black-box interpretability methods that are used in many real-world applications. Episode notes: https://cs.nyu.edu/~welleck/episode14.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[13] Adji Bousso Dieng - Deep Probabilistic Graphical Modeling 1:07:50
Adji Bousso Dieng is currently a Research Scientist at Google AI, and will be starting as an assistant professor at Princeton University in 2021. Her research focuses on combining probabilistic graphical modeling and deep learning to design models for structured high-dimensional data. Her PhD thesis is titled "Deep Probabilistic Graphical Modeling", which she completed in 2020 at Columbia University. We discuss her work on combining graphical models and deep learning, including models and algorithms, the value of interpretability and probabilistic models, as well as applications and making an impact through research. Episode notes: https://cs.nyu.edu/~welleck/episode13.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[12] Martha White - Regularized Factor Models 1:08:35
Martha White is an Associate Professor at the University of Alberta. Her research focuses on developing reinforcement learning and representation learning techniques for adaptive, autonomous agents learning on streams of data. Her PhD thesis is titled "Regularized Factor Models", which she completed in 2014 at the University of Alberta. We discuss the regularized factor model framework, which unifies many machine learning methods and led to new algorithms and applications. We talk about sparsity and how it also appears in her later work, as well as the common threads between her thesis work and her research in reinforcement learning. Episode notes: https://cs.nyu.edu/~welleck/episode12.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[11] Jacob Andreas - Learning from Language 1:19:43
Jacob Andreas is an Assistant Professor at MIT, where he leads the language and intelligence group, focusing on language as a communicative and computational tool. His PhD thesis is titled "Learning from Language", which he completed in 2018 at UC Berkeley. We discuss compositionality and neural module networks, the intersection of RL and language, and translating a neural communication channel called 'neuralese', which can lead to more interpretable machine learning models. Episode notes: https://cs.nyu.edu/~welleck/episode11.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[10] Chelsea Finn - Learning to Learn with Gradients 51:44
Chelsea Finn is an Assistant Professor at Stanford University, where she leads the IRIS lab that studies intelligence through robotic interaction at scale. Her PhD thesis is titled "Learning to Learn with Gradients", which she completed in 2018 at UC Berkeley. Chelsea received the prestigious ACM Doctoral Dissertation Award for her work in the thesis. We discuss machine learning for robotics, focusing on learning-to-learn - also known as meta-learning - and her work on the MAML algorithm during her PhD, as well as the future of robotics research. Episode notes: https://cs.nyu.edu/~welleck/episode10.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[09] Kenneth Stanley - Efficient Evolution of Neural Networks through Complexification 1:21:26
Kenneth Stanley is a researcher at OpenAI, where he leads the team on Open-endedness. Previously he was a Professor of Computer Science at the University of Central Florida, cofounder of Geometric Intelligence, and head of Core AI research at Uber AI Labs. His PhD thesis is titled "Efficient Evolution of Neural Networks through Complexification", which he completed in 2004 at the University of Texas. We talk about evolving increasingly complex structures and how this led to the NEAT algorithm that he developed during his PhD. We discuss his research directions related to open-endedness, how the field has changed over time, and how he currently views algorithms that were developed over a decade ago. Episode notes: https://cs.nyu.edu/~welleck/episode9.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[08] He He - Sequential Decisions and Predictions in NLP 1:00:39
He He is an Assistant Professor at New York University. Her research focuses on enabling reliable communication in natural language between machines and humans, including topics in text generation, robust language understanding, and dialogue systems. Her PhD thesis is titled "Sequential Decisions and Predictions in NLP", which she completed in 2016 at the University of Maryland. We talk about the intersection of language with imitation learning and reinforcement learning, her work in the thesis on opponent modeling and simultaneous translation, and how it relates to recent work on generation and robustness. Episode notes: https://cs.nyu.edu/~welleck/episode8.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[07] John Schulman - Optimizing Expectations: From Deep RL to Stochastic Computation Graphs 1:04:28
John Schulman is a Research Scientist and co-founder of OpenAI. John co-leads the reinforcement learning team, researching algorithms that safely and efficiently learn by trial and error and by imitating humans. His PhD thesis is titled "Optimizing Expectations: From Deep Reinforcement Learning to Stochastic Computation Graphs", which he completed in 2016 at Berkeley. We talk about his work on stochastic computation graphs and TRPO, how it evolved into PPO and how it's used in large-scale applications like OpenAI Five, as well as his recent work on generalization in RL. Episode notes: https://cs.nyu.edu/~welleck/episode7.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[06] Yoon Kim - Deep Latent Variable Models of Natural Language 1:05:50
Yoon Kim is currently a Research Scientist at the MIT-IBM AI Watson Lab, and will be joining MIT as an assistant professor in 2021. Yoon’s research focuses on machine learning and natural language processing. His PhD thesis is titled "Deep Latent Variable Models of Natural Language", which he completed in 2020 at Harvard University. We discuss his work on uncovering latent structure in natural language, including continuous vector representations, tree structures, and grammars. We cover learning and variational inference methods that he developed during his PhD, and he offers a look at where latent variable models will be heading in the future. Episode notes: https://cs.nyu.edu/~welleck/episode6.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[05] Julian Togelius - Computational Intelligence and Games 1:12:06
Julian Togelius is an Associate Professor at New York University, where he co-directs the NYU Game Innovation Lab. His research is at the intersection of computational intelligence and computer games. His PhD thesis is titled "Optimization, Imitation, and Innovation: Computational Intelligence and Games", which he completed in 2007. We cover his work in the thesis on AI for games and games for AI, and how it connects to his recent work on procedural content generation. Episode notes: https://cs.nyu.edu/~welleck/episode5.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.buymeacoffee.com/thesisreview…

[04] Sebastian Nowozin - Learning with Structured Data: Applications to Computer Vision 1:44:32
Sebastian Nowozin is currently a Researcher at Microsoft Research Cambridge. His research focuses on probabilistic deep learning, consequences of model misspecification, understanding agent complexity in order to improve learning efficiency, and designing models for reasoning and planning. His PhD thesis is titled "Learning with Structured Data: Applications to Computer Vision", which he completed in 2009. We discuss the work in his thesis on structured inputs and structured outputs, which involves beautiful ideas from polyhedral combinatorics and optimization. We talk about his recent work on Bayesian deep learning and the connections it has to ideas that he explored during his PhD. Episode notes: https://cs.nyu.edu/~welleck/episode4.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…