Are you a researcher, data scientist, analyst, or ninja? Do you want to learn Bayesian inference, stay up to date, or simply understand what Bayesian inference is? Then this podcast is for you! You'll hear from researchers and practitioners of all fields about how they use Bayesian statistics, and how in turn YOU can apply these methods in your own modeling workflow.

When I started learning Bayesian methods, I really wished there were a podcast out there that could introduce me to the methods, the projects and the people who make all that possible. So I created "Learning Bayesian Statistics", where you'll get to hear how Bayesian statistics are used to detect dark matter in outer space, forecast elections, or understand how diseases spread and can ultimately be stopped.

But this show is not only about successes -- it's also about failures, because that's how we learn best. So you'll often hear the guests talking about what *didn't* work in their projects, why, and how they overcame these challenges. Because, in the end, we're all lifelong learners!

My name is Alex Andorra, by the way, and I live in Estonia. By day, I'm a data scientist and modeler at the PyMC Labs consultancy (https://www.pymc-labs.io/). By night, I don't (yet) fight crime, but I'm an open-source enthusiast and core contributor to the Python packages PyMC (https://docs.pymc.io/) and ArviZ (https://arviz-devs.github.io/arviz/). I also love election forecasting (https://www.pollsposition.com/) and, most importantly, Nutella. But I don't like talking about it -- I prefer eating it.

So, whether you want to learn Bayesian statistics or hear about the latest libraries, books and applications, this podcast is for you -- just subscribe! You can also support the show and unlock exclusive Bayesian swag on Patreon (https://www.patreon.com/learnbayesstats)!
- 134 - #119 Causal Inference, Fiction Writing and Career Changes, with Robert Kubinec
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Bob's research focuses on corruption and political economy.
Measuring corruption is challenging because the behavior is, by nature, unobservable.
The challenge of studying corruption lies in obtaining honest data.
Innovative survey techniques, like randomized response, can help gather sensitive data (a small modeling sketch follows the chapter list below).
Non-traditional backgrounds can enhance statistical research perspectives.
Bayesian methods are particularly useful for estimating latent variables.
Bayesian methods shine in situations with prior information.
Expert surveys can help estimate uncertain outcomes effectively.
Bob's novel, 'The Bayesian Hitman,' explores academia through a fictional lens.
Writing fiction can enhance academic writing skills and creativity.
The importance of community in statistics is emphasized, especially in the Stan community.
Real-time online surveys could revolutionize data collection in social science.
Chapters:
00:00 Introduction to Bayesian Statistics and Bob Kubinec
06:01 Bob's Academic Journey and Research Focus
12:40 Measuring Corruption: Challenges and Methods
18:54 Transition from Government to Academia
26:41 The Influence of Non-Traditional Backgrounds in Statistics
34:51 Bayesian Methods in Political Science Research
42:08 Bayesian Methods in COVID Measurement
51:12 The Journey of Writing a Novel
01:00:24 The Intersection of Fiction and Academia
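The randomized-response technique mentioned in the takeaways translates naturally into a small latent-variable model. Below is a minimal, purely illustrative PyMC sketch (the counts and the truth-telling probability are invented, and none of this is code from Robert's work): respondents answer the sensitive question truthfully with a known probability and answer its negation otherwise, so the latent prevalence is only ever observed through a noisy "yes" rate.

```python
import pymc as pm

# Hypothetical survey: 500 respondents, 210 "yes" answers under a
# Warner-style randomized-response design with a known probability of
# answering truthfully (all numbers are illustrative).
n_respondents, n_yes, p_truth = 500, 210, 0.7

with pm.Model() as rr_model:
    # Latent prevalence of the sensitive behavior we actually care about
    pi = pm.Beta("pi", alpha=1, beta=1)

    # Probability of hearing "yes": truthful answer with prob p_truth,
    # negated answer otherwise
    p_yes = p_truth * pi + (1 - p_truth) * (1 - pi)

    # Likelihood of the aggregated survey responses
    pm.Binomial("yes_count", n=n_respondents, p=p_yes, observed=n_yes)

    idata = pm.sample(random_seed=1)
```

Because the mapping from the latent prevalence to the observable "yes" probability is known by design, the posterior recovers the sensitive quantity without any individual response being identifiable.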
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström and Stefan.
Links from the show:
Robert’s website (includes blog posts): https://www.robertkubinec.com/
Robert on GitHub: https://github.com/saudiwin
Robert on Linkedin: https://www.linkedin.com/in/robert-kubinec-9191a9a/
Robert on Google Scholar: https://scholar.google.com/citations?user=bhOaXR4AAAAJ&hl=en
Robert on Twitter: https://x.com/rmkubinec
Robert on Bluesky: https://bsky.app/profile/rmkubinec.bsky.social
The Bayesian Hitman: https://www.amazon.com/Bayesian-Hitman-Robert-M-Kubinec/dp/B0D6M4WNRZ/
Ordbetareg overview: https://www.robertkubinec.com/ordbetareg
Idealstan – this isn’t out yet, but you can access an older working paper here: https://osf.io/preprints/osf/8j2bt
Ordinal Regression tutorial, Michael Betancourt: https://betanalpha.github.io/assets/case_studies/ordinal_regression.html
Andrew Heiss blog: https://www.andrewheiss.com/blog/
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 13 Nov 2024 - 1h 25min
- 133 - #118 Exploring the Future of Stan, with Charles Margossian & Brian Ward
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
User experience is crucial for the adoption of Stan.
Recent innovations include adding tuples to the Stan language, new features, and improved error messages.
Tuples allow for more efficient data handling in Stan.
Beginners often struggle with the compiled nature of Stan.
Improving error messages is crucial for user experience.
BridgeStan allows for integration with other programming languages and makes it very easy for people to use Stan models.
Community engagement is vital for the development of Stan.
New samplers are being developed to enhance performance.
The future of Stan includes more user-friendly features.
Chapters:
00:00 Introduction to the Live Episode
02:55 Meet the Stan Core Developers
05:47 Brian Ward's Journey into Bayesian Statistics
09:10 Charles Margossian's Contributions to Stan
11:49 Recent Projects and Innovations in Stan
15:07 User-Friendly Features and Enhancements
18:11 Understanding Tuples and Their Importance
21:06 Challenges for Beginners in Stan
24:08 Pedagogical Approaches to Bayesian Statistics
30:54 Optimizing Monte Carlo Estimators
32:24 Reimagining Stan's Structure
34:21 The Promise of Automatic Reparameterization
35:49 Exploring BridgeStan
40:29 The Future of Samplers in Stan
43:45 Evaluating New Algorithms
47:01 Specific Algorithms for Unique Problems
50:00 Understanding Model Performance
54:21 The Impact of Stan on Bayesian Research
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke and Robert Flannery.
Links from the show:
Come see the show live at PyData NYC: https://pydata.org/nyc2024/
LBS #90, Demystifying MCMC & Variational Inference, with Charles Margossian: https://learnbayesstats.com/episode/90-demystifying-mcmc-variational-inference-charles-margossian/
Charles' website: https://charlesm93.github.io/
Charles on GitHub: https://github.com/charlesm93
Charles on LinkedIn: https://www.linkedin.com/in/charles-margossian-3428935b/
Charles on Google Scholar: https://scholar.google.com/citations?user=nPtLsvIAAAAJ&hl=en
Charles on Twitter: https://x.com/charlesm993
Brian's website: https://brianward.dev/
Brian on GitHub: https://github.com/WardBrian
Brian on LinkedIn: https://www.linkedin.com/in/ward-brianm/
Brian on Google Scholar: https://scholar.google.com/citations?user=bzosqW0AAAAJ&hl=en
Brian on Twitter: https://x.com/ward_brianm
Bob Carpenter's reflections on StanCon: https://statmodeling.stat.columbia.edu/category/bayesian-statistics/
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 30 Oct 2024 - 58min
- 132 - #117 Unveiling the Power of Bayesian Experimental Design, with Desi Ivanova
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Designing experiments is about optimal data gathering.
The optimal design maximizes the amount of information (a small sketch of the expected-information-gain computation follows the chapter list below).
The best experiment reduces uncertainty the most.
Computational challenges limit the feasibility of BED in practice.
Amortized Bayesian inference can speed up computations.
A good underlying model is crucial for effective BED.
Adaptive experiments are more complex than static ones.
The future of BED is promising with advancements in AI.
Chapters:
00:00 Introduction to Bayesian Experimental Design
07:51 Understanding Bayesian Experimental Design
19:58 Computational Challenges in Bayesian Experimental Design
28:47 Innovations in Bayesian Experimental Design
40:43 Practical Applications of Bayesian Experimental Design
52:12 Future of Bayesian Experimental Design
01:01:17 Real-World Applications and Impact
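To make the "optimal design maximizes information" takeaway concrete, here is a self-contained sketch of the standard nested Monte Carlo estimator of expected information gain (EIG) for a toy problem: choosing the single measurement time that is most informative about a decay rate. The model, priors, and grid of candidate designs are all made up for illustration; real BED problems are exactly where the computational challenges discussed in the episode kick in.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def eig(design_t, n_outer=2000, n_inner=500, sigma=0.1):
    """Nested Monte Carlo estimate of the expected information gain of
    measuring y ~ Normal(exp(-theta * t), sigma) at time t."""
    theta = rng.gamma(2.0, 0.5, size=n_outer)               # prior draws
    y = rng.normal(np.exp(-theta * design_t), sigma)         # simulated outcomes

    # log p(y | theta, t) under the thetas that generated each y
    log_lik = stats.norm.logpdf(y, np.exp(-theta * design_t), sigma)

    # log p(y | t), estimated with fresh prior draws (inner Monte Carlo loop)
    theta_inner = rng.gamma(2.0, 0.5, size=(n_inner, 1))
    log_marg = stats.norm.logpdf(y, np.exp(-theta_inner * design_t), sigma)
    log_marg = np.logaddexp.reduce(log_marg, axis=0) - np.log(n_inner)

    # EIG(t) = E[ log p(y | theta, t) - log p(y | t) ]
    return np.mean(log_lik - log_marg)

# Scan candidate measurement times and pick the most informative one
designs = np.linspace(0.1, 5.0, 20)
scores = [eig(t) for t in designs]
print("most informative measurement time:", designs[int(np.argmax(scores))])
```

The nested double expectation is what makes naive BED expensive, and it is precisely this cost that amortized, simulation-based approaches aim to avoid.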
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang and Gary Clarke.
Links from the show:
Come see the show live at PyData NYC: https://pydata.org/nyc2024/
Desi’s website: https://desirivanova.com/
Desi on GitHub: https://github.com/desi-ivanova
Desi on Google Scholar: https://scholar.google.com/citations?user=AmX6sMIAAAAJ&hl=en
Desi on Linkedin: https://www.linkedin.com/in/dr-ivanova/
Desi on Twitter: https://x.com/desirivanova
LBS #34, Multilevel Regression, Post-stratification & Missing Data, with Lauren Kennedy: https://learnbayesstats.com/episode/34-multilevel-regression-post-stratification-missing-data-lauren-kennedy/
LBS #35, The Past, Present & Future of BRMS, with Paul Bürkner: https://learnbayesstats.com/episode/35-past-present-future-brms-paul-burkner/
LBS #45, Biostats & Clinical Trial Design, with Frank Harrell: https://learnbayesstats.com/episode/45-biostats-clinical-trial-design-frank-harrell/
LBS #107, Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt: https://learnbayesstats.com/episode/107-amortized-bayesian-inference-deep-neural-networks-marvin-schmitt/
Bayesian Experimental Design (BED) with BayesFlow and PyTorch: https://github.com/stefanradev93/BayesFlow/blob/dev/examples/michaelis_menten_BED_tutorial.ipynb
Paper – Modern Bayesian Experimental Design: https://arxiv.org/abs/2302.14545
Paper – Optimal experimental design: Formulations and computations: https://arxiv.org/pdf/2407.16212
Information theory, inference and learning algorithms, by the great late Sir David MacKay: https://www.inference.org.uk/itprnn/book.pdf
Patterns, Predictions and Actions, Moritz Hardt and Ben Recht: https://mlstory.org/index.html
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 15 Oct 2024 - 1h 13min
- 131 - #116 Mastering Soccer Analytics, with Ravi Ramineni
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Building an athlete management system and a scouting and recruitment platform are key goals in football analytics.
The focus is on informing training decisions, preventing injuries, and making smart player signings.
Avoiding false positives in player evaluations is crucial, and data analysis plays a significant role in making informed decisions.
There are similarities between different football teams, and the sport has social and emotional aspects.
Transitioning from on-premises SQL servers to cloud-based systems is a significant endeavor in football analytics.
Analytics is a tool that aids the decision-making process and helps mitigate biases.
The impact of analytics in soccer can be seen in the decline of long-range shots.
Collaboration and trust between analysts and decision-makers are crucial for successful implementation of analytics.
The limitations of available data in football analytics hinder the ability to directly measure decision-making on the field.
Analyzing the impact of coaches is challenging due to the difficulty of separating their effect from other factors, and current data limitations make it hard to evaluate coaching performance accurately.
Predictive metrics and modeling play a crucial role in soccer analytics, especially in predicting the career progression of young players.
Improving tracking data and expanding its availability will be a significant focus in the future of soccer analytics.
Chapters:
00:00 Introduction to Ravi and His Role at Seattle Sounders
06:30 Building an Analytics Department
15:00 The Impact of Analytics on Player Recruitment and Performance
28:00 Challenges and Innovations in Soccer Analytics
42:00 Player Health, Injury Prevention, and Training
55:00 The Evolution of Data-Driven Strategies
01:10:00 Future of Analytics in Sports
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang and Gary Clarke.
Links from the show:
LBS Sports Analytics playlist: https://www.youtube.com/playlist?list=PL7RjIaSLWh5kDiPVMUSyhvFaXL3NoXOe4
Ravi on Linkedin: https://www.linkedin.com/in/ravi-ramineni-3798374/
Ravi on Twitter: https://x.com/analyseFooty
Decisions in Football - The Power of Compounding | StatsBomb Conference 2023: https://www.youtube.com/watch?v=D7CXtwDg9lM
The Signal and the Noise: https://www.amazon.com/Signal-Noise-Many-Predictions-Fail-but/dp/0143125087
PreliZ – A tool-box for prior elicitation: https://preliz.readthedocs.io/en/latest/
Ravi talking on Ted Knutson's podcast: https://open.spotify.com/episode/1exLBfyFf0d1dm2IaXkd2v
More about Ravi's work at the Seattle Sounders: https://www.trumedianetworks.com/expected-value-podcast/ravi-ramineni
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 02 Oct 2024 - 1h 32min
- 130 - #115 Using Time Series to Estimate Uncertainty, with Nate Haines
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
State space models and traditional time series models are well-suited to forecast loss ratios in the insurance industry, although actuaries have been slow to adopt modern statistical methods.
Working with limited data is a challenge, but informed priors and hierarchical models can help improve the modeling process.
Bayesian model stacking allows for blending different model predictions and taking the best of both (or all, if more than two models) worlds (a short ArviZ sketch follows the chapter list below).
Model comparison is done using out-of-sample performance metrics, such as the expected log pointwise predictive density (ELPD). Brute-force leave-future-out cross-validation is often used due to the time-series nature of the data.
Stacking or averaging models are trained on out-of-sample performance metrics to determine the weights for blending the predictions.
Model stacking can be a powerful approach for combining predictions from candidate models. Hierarchical stacking in particular is useful when weights are assumed to vary according to covariates.
BayesBlend is a Python package developed by Ledger Investing that simplifies the implementation of stacking models, including pseudo-Bayesian model averaging, stacking, and hierarchical stacking.
Evaluating the performance of Bayesian time series models requires considering multiple metrics, including log-likelihood-based metrics like ELPD, as well as more absolute metrics like RMSE and mean absolute error.
Using robust variants of metrics like ELPD can help address issues with extreme outliers, for example t-distribution estimators of ELPD as opposed to sample sum/mean estimators.
It is important to evaluate model performance from different perspectives and consider the trade-offs between different metrics. Evaluating models based solely on traditional metrics can limit understanding and trust in the model; consider additional factors such as interpretability, maintainability, and productionization.
Simulation-based calibration (SBC) is a valuable tool for assessing parameter estimation and model correctness. It allows for the interpretation of model parameters and the identification of coding errors.
In industries like insurance, where regulations may restrict model choices, classical statistical approaches still play a significant role. However, there is potential for Bayesian methods and generative AI in certain areas.
Chapters:
00:00 Introduction to Bayesian Modeling in Insurance
13:00 Time Series Models and Their Applications
30:51 Bayesian Model Averaging Explained
56:20 Impact of External Factors on Forecasting
01:25:03 Future of Bayesian Modeling and AI
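The stacking workflow from the takeaways can be reproduced in miniature with ArviZ, which computes ELPD via PSIS-LOO and the corresponding stacking weights. The sketch below uses toy data and two deliberately simple competing models; it is not Ledger's BayesBlend code, and it uses plain LOO rather than the leave-future-out scheme discussed in the episode.

```python
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(1)
y = rng.standard_t(df=4, size=100)  # toy heavy-tailed outcome

def fit(likelihood):
    with pm.Model():
        mu = pm.Normal("mu", 0, 1)
        sigma = pm.HalfNormal("sigma", 1)
        if likelihood == "normal":
            pm.Normal("y", mu, sigma, observed=y)
        else:
            nu = pm.Exponential("nu", 1 / 10)
            pm.StudentT("y", nu=nu, mu=mu, sigma=sigma, observed=y)
        # keep pointwise log-likelihoods so ELPD can be computed afterwards
        return pm.sample(random_seed=1, idata_kwargs={"log_likelihood": True})

idata_normal, idata_t = fit("normal"), fit("student_t")

# ELPD (via PSIS-LOO) and stacking weights for blending the two models
cmp = az.compare(
    {"normal": idata_normal, "student_t": idata_t},
    ic="loo",
    method="stacking",
)
print(cmp[["elpd_loo", "weight"]])
```

The `weight` column is what a stacked prediction would use to blend the two posterior predictive distributions.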
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Nate’s website: http://haines-lab.com/
Nate on GitHub: https://github.com/Nathaniel-Haines
Nate on Linkedin: https://www.linkedin.com/in/nathaniel-haines-216049101/
Nate on Twitter: https://x.com/nate__haines
Nate on Google Scholar: https://scholar.google.com/citations?user=lg741SgAAAAJ
LBS #14 Hidden Markov Models & Statistical Ecology, with Vianey Leos-Barajas: https://learnbayesstats.com/episode/14-hidden-markov-models-statistical-ecology-with-vianey-leos-barajas/
LBS #107 Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt: https://learnbayesstats.com/episode/107-amortized-bayesian-inference-deep-neural-networks-marvin-schmitt/
LBS #109 Prior Sensitivity Analysis, Overfitting & Model Selection, with Sonja Winter: https://learnbayesstats.com/episode/109-prior-sensitivity-analysis-overfitting-model-selection-sonja-winter/
BayesBlend – Easy Model Blending: https://arxiv.org/abs/2405.00158
BayesBlend documentation: https://ledger-investing-bayesblend.readthedocs-hosted.com/en/latest/
SBC paper: https://arxiv.org/abs/1804.06788
Isaac Asimov’s Foundation (Hari Seldon): https://en.wikipedia.org/wiki/Hari_Seldon
Stancon 2023 talk on Ledger’s Bayesian modeling workflow: https://github.com/stan-dev/stancon2023/blob/main/Nathaniel-Haines/slides.pdf
Ledger’s Bayesian modeling workflow: https://arxiv.org/abs/2407.14666v1
More on Ledger Investing: https://www.ledgerinvesting.com/about-us
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 17 Sep 2024 - 1h 39min
- 129 - #114 From the Field to the Lab – A Journey in Baseball Science, with Jacob Buffa
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Education and visual communication are key in helping athletes understand the impact of nutrition on performance.
Bayesian statistics are used to analyze player performance and injury risk.
Integrating diverse data sources is a challenge but can provide valuable insights.
Understanding the specific needs and characteristics of athletes is crucial in conditioning and injury prevention.
The application of Bayesian statistics in baseball science requires experts in Bayesian methods.
Traditional statistical methods taught in sports science programs are limited.
Communicating complex statistical concepts, such as Bayesian analysis, to coaches and players is crucial.
Conveying uncertainties and limitations of the models is essential for effective utilization.
Emerging trends in baseball science include the use of biomechanical information and computer vision algorithms.
Improving player performance and injury prevention are key goals for the future of baseball science.
Chapters:
00:00 The Role of Nutrition and Conditioning
05:46 Analyzing Player Performance and Managing Injury Risks
12:13 Educating Athletes on Dietary Choices
18:02 Emerging Trends in Baseball Science
29:49 Hierarchical Models and Player Analysis
36:03 Challenges of Working with Limited Data
39:49 Effective Communication of Statistical Concepts
47:59 Future Trends: Biomechanical Data Analysis and Computer Vision Algorithms
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
LBS Sports Analytics playlist: https://www.youtube.com/playlist?list=PL7RjIaSLWh5kDiPVMUSyhvFaXL3NoXOe4
Jacob on Linkedin: https://www.linkedin.com/in/jacob-buffa-46bb7481/
Jacob on Twitter: https://x.com/EBA_Buffa
The Book – Playing The Percentages In Baseball: https://www.amazon.com/Book-Playing-Percentages-Baseball/dp/1494260174
Future Value – The Battle for Baseball's Soul and How Teams Will Find the Next Superstar: https://www.amazon.com/Future-Value-Battle-Baseballs-Superstar/dp/1629377678
The MVP Machine – How Baseball's New Nonconformists Are Using Data to Build Better Players: https://www.amazon.com/MVP-Machine-Baseballs-Nonconformists-Players/dp/1541698940
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Thu, 05 Sep 2024 - 1h 01min
- 128 - #113 A Deep Dive into Bayesian Stats, with Alex Andorra, ft. the Super Data Science Podcast
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Bayesian statistics is a powerful framework for handling complex problems, making use of prior knowledge, and excelling with limited data.
Bayesian statistics provides a framework for updating beliefs and making predictions based on prior knowledge and observed data.
Bayesian methods allow for the explicit incorporation of prior assumptions, which can provide structure and improve the reliability of the analysis.
There are several Bayesian frameworks available, such as PyMC, Stan, and Bambi, each with its own strengths and features.
PyMC is a powerful library for Bayesian modeling that allows for flexible and efficient computation.
For beginners, it is recommended to start with introductory courses or resources that provide a step-by-step approach to learning Bayesian statistics.
PyTensor leverages GPU acceleration and complex graph optimizations to improve the performance and scalability of Bayesian models.
ArviZ is a library for post-modeling workflows in Bayesian statistics, providing tools for model diagnostics and result visualization.
Gaussian processes are versatile non-parametric models that can be used for spatial and temporal data analysis in Bayesian statistics (a minimal PyMC example follows the chapter list below).
Chapters:
00:00 Introduction to Bayesian Statistics
07:32 Advantages of Bayesian Methods
16:22 Incorporating Priors in Models
23:26 Modeling Causal Relationships
30:03 Introduction to PyMC, Stan, and Bambi
34:30 Choosing the Right Bayesian Framework
39:20 Getting Started with Bayesian Statistics
44:39 Understanding Bayesian Statistics and PyMC
49:01 Leveraging PyTensor for Improved Performance and Scalability
01:02:37 Exploring Post-Modeling Workflows with ArviZ
01:08:30 The Power of Gaussian Processes in Bayesian Modeling
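As a companion to the Gaussian-process takeaway, here is a minimal latent-GP regression in PyMC on synthetic one-dimensional data. It is purely illustrative; for anything beyond small datasets, the HSGP approximation linked below is the faster route.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(2)
X = np.linspace(0, 10, 40)[:, None]          # GP inputs must be 2-D: (n, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.3, size=40)

with pm.Model() as gp_model:
    ell = pm.Gamma("ell", alpha=2, beta=1)           # lengthscale
    eta = pm.HalfNormal("eta", sigma=2)              # marginal standard deviation
    cov = eta**2 * pm.gp.cov.ExpQuad(1, ls=ell)

    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=X)                           # latent function values

    sigma = pm.HalfNormal("sigma", sigma=1)
    pm.Normal("y_obs", mu=f, sigma=sigma, observed=y)

    idata = pm.sample(random_seed=2)
```

One common next step, covered in the HSGP links below, is to replace the exact latent GP with the Hilbert-space approximation once the dataset grows and the full covariance matrix becomes too expensive.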
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Original episode on the Super Data Science podcast: https://www.superdatascience.com/podcast/bayesian-methods-and-applications-with-alexandre-andorra
Advanced Regression with Bambi and PyMC: https://www.intuitivebayes.com/advanced-regression
Gaussian Processes: HSGP Reference & First Steps: https://www.pymc.io/projects/examples/en/latest/gaussian_processes/HSGP-Basic.html
Modeling Webinar – Fast & Efficient Gaussian Processes: https://www.youtube.com/watch?v=9tDMouGue8g
Modeling spatial data with Gaussian processes in PyMC: https://www.pymc-labs.com/blog-posts/spatial-gaussian-process-01/
Hierarchical Bayesian Modeling of Survey Data with Post-stratification: https://www.pymc-labs.com/blog-posts/2022-12-08-Salk/
PyMC docs: https://www.pymc.io/welcome.html
Bambi docs: https://bambinos.github.io/bambi/
PyMC Labs: https://www.pymc-labs.com/
LBS #50 Ta(l)king Risks & Embracing Uncertainty, with David Spiegelhalter: https://learnbayesstats.com/episode/50-talking-risks-embracing-uncertainty-david-spiegelhalter/
LBS #51 Bernoulli’s Fallacy & the Crisis of Modern Science, with Aubrey Clayton: https://learnbayesstats.com/episode/51-bernoullis-fallacy-crisis-modern-science-aubrey-clayton/
LBS #63 Media Mix Models & Bayes for Marketing, with Luciano Paz: https://learnbayesstats.com/episode/63-media-mix-models-bayes-marketing-luciano-paz/
LBS #83 Multilevel Regression, Post-Stratification & Electoral Dynamics, with Tarmo Jüristo: https://learnbayesstats.com/episode/83-multilevel-regression-post-stratification-electoral-dynamics-tarmo-juristo/
Jon Krohn on YouTube: https://www.youtube.com/JonKrohnLearns
Jon Krohn on Linkedin: https://www.linkedin.com/in/jonkrohn/
Jon Krohn on Twitter: https://x.com/JonKrohnLearns
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Thu, 22 Aug 2024 - 1h 30min
- 127 - #112 Advanced Bayesian Regression, with Tomi Capretto
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Teaching Bayesian Concepts Using M&Ms: Tomi Capretto uses an engaging classroom exercise involving M&Ms to teach Bayesian statistics, making abstract concepts tangible and intuitive for students.
Practical Applications of Bayesian Methods: Discussion of the real-world application of Bayesian methods in projects at PyMC Labs and in university settings, emphasizing the practical impact and accessibility of Bayesian statistics.
Contributions to Open-Source Software: Tomi’s involvement in developing Bambi and other open-source tools demonstrates the importance of community contributions to advancing statistical software (a small Bambi example follows the chapter list below).
Challenges in Statistical Education: Tomi talks about the challenges and rewards of teaching complex statistical concepts to students who are accustomed to frequentist approaches, highlighting the shift to thinking probabilistically in Bayesian frameworks.
Future of Bayesian Tools: The discussion also touches on future enhancements for Bambi and PyMC, aiming to make these tools more robust and user-friendly for a wider audience, including those who are not professional statisticians.
Chapters:
05:36 Tomi's Work and Teaching
10:28 Teaching Complex Statistical Concepts with Practical Exercises
23:17 Making Bayesian Modeling Accessible in Python
38:46 Advanced Regression with Bambi
41:14 The Power of Linear Regression
42:45 Exploring Advanced Regression Techniques
44:11 Regression Models and Dot Products
45:37 Advanced Concepts in Regression
46:36 Diagnosing and Handling Overdispersion
47:35 Parameter Identifiability and Overparameterization
50:29 Visualizations and Course Highlights
51:30 Exploring Niche and Advanced Concepts
56:56 The Power of Zero-Sum Normal
59:59 The Value of Exercises and Community
01:01:56 Optimizing Computation with Sparse Matrices
01:13:37 Avoiding MCMC and Exploring Alternatives
01:18:27 Making Connections Between Different Models
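Since much of this conversation revolves around Bambi's formula interface, here is a tiny varying-intercepts example on simulated data (group names and effect sizes are invented): the kind of model that takes a page of raw PyMC fits in one formula string.

```python
import numpy as np
import pandas as pd
import arviz as az
import bambi as bmb

rng = np.random.default_rng(3)
groups = list("ABCDE")
group_effect = dict(zip(groups, rng.normal(0, 0.5, size=len(groups))))

df = pd.DataFrame({
    "group": rng.choice(groups, size=300),
    "x": rng.normal(size=300),
})
df["y"] = (
    1.0 + 0.6 * df["x"]
    + df["group"].map(group_effect)
    + rng.normal(0, 0.4, size=300)
)

# Varying intercepts by group, common slope for x
model = bmb.Model("y ~ x + (1|group)", df)
idata = model.fit(draws=1000, chains=2)
print(az.summary(idata))
```

Under the hood this builds a PyMC model, so everything downstream (ArviZ diagnostics, posterior predictive checks) works exactly as it would for a hand-written model.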
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Tomi’s website: https://tomicapretto.com/
Tomi on GitHub: https://github.com/tomicapretto
Tomi on Linkedin: https://www.linkedin.com/in/tom%C3%A1s-capretto-a89873106/
Tomi on Twitter: https://x.com/caprettotomas
Advanced Regression course (get 10% off if you’re a Patron of the show): https://www.intuitivebayes.com/advanced-regression
Bambi: https://bambinos.github.io/bambi/
LBS #35 The Past, Present & Future of BRMS, with Paul Bürkner: https://learnbayesstats.com/episode/35-past-present-future-brms-paul-burkner/
LBS #1 Bayes, open-source and bioinformatics, with Osvaldo Martin: https://learnbayesstats.com/episode/1-bayes-open-source-and-bioinformatics-with-osvaldo-martin/
patsy - Describing statistical models in Python: https://patsy.readthedocs.io/en/latest/
formulae - Formulas for mixed-models in Python: https://bambinos.github.io/formulae/
Introducing Bayesian Analysis With m&m's®: An Active-Learning Exercise for Undergraduates: https://www.tandfonline.com/doi/full/10.1080/10691898.2019.1604106
Richly Parameterized Linear Models: Additive, Time Series, and Spatial Models Using Random Effects: https://www.routledge.com/Richly-Parameterized-Linear-Models-Additive-Time-Series-and-Spatial-Models-Using-Random-Effects/Hodges/p/book/9780367533731
Dan Simpson’s Blog (link to blogs with the ‘sparse matrices’ tag): https://dansblog.netlify.app/#category=Sparse%20matrices
Repository for Sparse Matrix-Vector dot product: https://github.com/tomicapretto/dot_tests
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 07 Aug 2024 - 1h 27min
- 126 - #111 Nerdinsights from the Football Field, with Patrick Ward
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Communicating Bayesian concepts to non-technical audiences in sports analytics can be challenging, but it is important to provide clear explanations and address limitations.
Understanding the model and its assumptions is crucial for effective communication and decision-making.
Involving domain experts, such as scouts and coaches, can provide valuable insights and improve the model's relevance and usefulness.
Customizing the model to align with the specific needs and questions of the stakeholders is essential for successful implementation.
Understanding the needs of decision-makers is crucial for effectively communicating and utilizing models in sports analytics.
Predicting the impact of training loads on athletes' well-being and performance is a challenging frontier in sports analytics.
Identifying discrete events in team sports data is essential for analysis and development of models.
Chapters:
00:00 Bayesian Statistics in Sports Analytics
18:29 Applying Bayesian Stats in Analyzing Player Performance and Injury Risk
36:21 Challenges in Communicating Bayesian Concepts to Non-Statistical Decision-Makers
41:04 Understanding Model Behavior and Validation through Simulations
43:09 Applying Bayesian Methods in Sports Analytics
48:03 Clarifying Questions and Utilizing Frameworks
53:41 Effective Communication of Statistical Concepts
57:50 Integrating Domain Expertise with Statistical Models
01:13:43 The Importance of Good Data
01:18:11 The Future of Sports Analytics
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
LBS Sports Analytics playlist: https://www.youtube.com/playlist?list=PL7RjIaSLWh5kDiPVMUSyhvFaXL3NoXOe4
Patrick’s website: http://optimumsportsperformance.com/blog/
Patrick on GitHub: https://github.com/pw2
Patrick on Linkedin: https://www.linkedin.com/in/patrickward02/
Patrick on Twitter: https://twitter.com/OSPpatrick
Patrick & Ellis Screencast: https://github.com/thebioengineer/TidyX
Patrick on Research Gate: https://www.researchgate.net/profile/Patrick-Ward-10
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 24 Jul 2024 - 1h 25min
- 125 - #110 Unpacking Bayesian Methods in AI with Sam Duffield
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Use mini-batch methods to efficiently process large datasets within Bayesian frameworks in enterprise AI applications.
Apply approximate inference techniques, like stochastic gradient MCMC and the Laplace approximation, to make Bayesian analysis practical at scale (a bare-bones SGLD sketch follows the chapter list below).
Explore thermodynamic computing to significantly speed up Bayesian computations, enhancing model efficiency and scalability.
Leverage the Posteriors Python package for flexible and integrated Bayesian analysis in modern machine learning workflows.
Overcome challenges in Bayesian inference by simplifying complex concepts for non-expert audiences, ensuring the practical application of statistical models.
Address the intricacies of model assumptions and communicate effectively with non-technical stakeholders to enhance decision-making processes.
Chapters:
00:00 Introduction to Large-Scale Machine Learning
11:26 Scalable and Flexible Bayesian Inference with Posteriors
25:56 The Role of Temperature in Bayesian Models
32:30 Stochastic Gradient MCMC for Large Datasets
36:12 Introducing Posteriors: Bayesian Inference in Machine Learning
41:22 Uncertainty Quantification and Improved Predictions
52:05 Supporting New Algorithms and Arbitrary Likelihoods
59:16 Thermodynamic Computing
01:06:22 Decoupling Model Specification, Data Generation, and Inference
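To give a feel for the stochastic gradient MCMC methods mentioned in the takeaways, here is a bare-bones stochastic gradient Langevin dynamics (SGLD) loop on a toy Gaussian-mean problem, written in plain NumPy rather than with the Posteriors package (whose API is not reproduced here): each step combines a minibatch estimate of the log-posterior gradient with injected Gaussian noise scaled to the step size.

```python
import numpy as np

rng = np.random.default_rng(4)
N, batch_size = 10_000, 100
data = rng.normal(2.0, 1.0, size=N)        # toy dataset, true mean = 2

theta, step = 0.0, 1e-4                     # parameter and SGLD step size
samples = []

for _ in range(5_000):
    mb = rng.choice(data, size=batch_size, replace=False)

    # Gradient of the log posterior: N(0, 10^2) prior plus the minibatch
    # log-likelihood rescaled to the full dataset size
    grad_log_prior = -theta / 10.0**2
    grad_log_lik = (N / batch_size) * np.sum(mb - theta)   # Normal(theta, 1) model
    grad = grad_log_prior + grad_log_lik

    # SGLD update: half-step along the gradient plus injected Gaussian noise
    theta = theta + 0.5 * step * grad + rng.normal(0, np.sqrt(step))
    samples.append(theta)

print("approximate posterior mean:", np.mean(samples[1_000:]))
```

On this toy model the chain settles around the true posterior mean; the appeal is that no full-data gradient is ever needed, which is what makes the approach viable for the large-scale settings discussed above.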
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Sam on Twitter: https://x.com/Sam_Duffield
Sam on Scholar: https://scholar.google.com/citations?user=7wm_ka8AAAAJ&hl=en&oi=ao
Sam on Linkedin: https://www.linkedin.com/in/samduffield/
Sam on GitHub: https://github.com/SamDuffield
Posteriors paper (new!): https://arxiv.org/abs/2406.00104
Blog post introducing Posteriors: https://blog.normalcomputing.ai/posts/introducing-posteriors/posteriors.html
Posteriors docs: https://normal-computing.github.io/posteriors/
Paper introducing Posteriors – Scalable Bayesian Learning with posteriors: https://arxiv.org/abs/2406.00104v1
Normal Computing scholar: https://scholar.google.com/citations?hl=en&user=jGCLWRUAAAAJ&view_op=list_works
Thermo blogs: https://blog.normalcomputing.ai/posts/2023-11-09-thermodynamic-inversion/thermo-inversion.html and https://blog.normalcomputing.ai/posts/thermox/thermox.html
Great paper on SGMCMC: https://proceedings.neurips.cc/paper_files/paper/2015/file/9a4400501febb2a95e79248486a5f6d3-Paper.pdf
David MacKay textbook on Sustainable Energy: https://www.withouthotair.com/
LBS #107 - Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt: https://learnbayesstats.com/episode/107-amortized-bayesian-inference-deep-neural-networks-marvin-schmitt/
LBS #98 - Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 10 Jul 2024 - 1h 12min
- 124 - #109 Prior Sensitivity Analysis, Overfitting & Model Selection, with Sonja Winter
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways
Bayesian methods align better with researchers' intuitive understanding of research questions and provide more tools to evaluate and understand models.
Prior sensitivity analysis is crucial for understanding the robustness of findings to changes in priors and helps in contextualizing research findings (a small sketch follows the chapter list below).
Bayesian methods offer an elegant and efficient way to handle missing data in longitudinal studies, providing more flexibility and information for researchers.
Fit indices in Bayesian model selection are effective in detecting underfitting but may struggle to detect overfitting, highlighting the need for caution with model complexity.
Bayesian methods have the potential to revolutionize educational research by addressing the challenges of small samples, complex nesting structures, and longitudinal data.
Posterior predictive checks are valuable for model evaluation and selection.
Chapters
00:00 The Power and Importance of Priors
09:29 Updating Beliefs and Choosing Reasonable Priors
16:08 Assessing Robustness with Prior Sensitivity Analysis
34:53 Aligning Bayesian Methods with Researchers' Thinking
37:10 Detecting Overfitting in SEM
43:48 Evaluating Model Fit with Posterior Predictive Checks
47:44 Teaching Bayesian Methods
54:07 Future Developments in Bayesian Statistics
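Prior sensitivity analysis, as discussed in the takeaways, often amounts to refitting the same model under a handful of priors and checking how much the posterior moves. Here is a minimal PyMC sketch on toy data (all numbers invented, and deliberately using a small sample, where priors matter most):

```python
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(5)
y = rng.normal(0.3, 1.0, size=30)           # small toy sample

posteriors = {}
for label, prior_sd in [("tight", 0.1), ("default", 1.0), ("diffuse", 10.0)]:
    with pm.Model():
        mu = pm.Normal("mu", 0, prior_sd)   # only the prior on mu changes
        sigma = pm.HalfNormal("sigma", 1)
        pm.Normal("y", mu, sigma, observed=y)
        posteriors[label] = pm.sample(random_seed=5, progressbar=False)

# How much does the posterior for mu move across priors?
for label, idata in posteriors.items():
    summary = az.summary(idata, var_names=["mu"])
    print(label, summary[["mean", "hdi_3%", "hdi_97%"]].to_string(index=False))
```

If the three posteriors land in roughly the same place, the findings are robust to the prior; if the tight prior drags the estimate toward zero, that is exactly the kind of dependence a sensitivity analysis is meant to surface.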
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show
Sonja’s website: https://winterstat.github.io/
Sonja on Twitter: https://twitter.com/winterstat
Sonja on GitHub: https://github.com/winterstat
Under-Fitting and Over-Fitting – The Performance of Bayesian Model Selection and Fit Indices in SEM: https://www.tandfonline.com/doi/full/10.1080/10705511.2023.2280952
LBS #102 – Bayesian Structural Equation Modeling & Causal Inference in Psychometrics, with Ed Merkle: https://youtu.be/lXd-qstzTh4?si=jLg_qZTt1oQqRO0R
LBS #107 - Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt: https://learnbayesstats.com/episode/107-amortized-bayesian-inference-deep-neural-networks-marvin-schmitt/
BayesFlow tutorial: https://bayesflow.org/_examples/Intro_Amortized_Posterior_Estimation.html
LBS #106 Active Statistics, Two Truths & a Lie, with Andrew Gelman: https://learnbayesstats.com/episode/106-active-statistics-two-truths-a-lie-andrew-gelman/
LBS #61 Why we still use non-Bayesian methods, with EJ Wagenmakers: https://learnbayesstats.com/episode/61-why-we-still-use-non-bayesian-methods-ej-wagenmakers/
Bayesian Workflow paper: https://arxiv.org/abs/2011.01808
Michael Betancourt's blog: https://betanalpha.github.io/writing/
LBS #35 The Past, Present & Future of BRMS, with Paul Bürkner: https://learnbayesstats.com/episode/35-past-present-future-brms-paul-burkner/
Bayesian Model-Building Interface in Python: https://bambinos.github.io/bambi/
Advanced Regression online course: https://www.intuitivebayes.com/advanced-regression
BLIMP: https://www.appliedmissingdata.com/blimp
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 25 Jun 2024 - 1h 10min
- 123 - #108 Modeling Sports & Extracting Player Values, with Paul Sabin
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways
Convincing non-stats stakeholders in sports analytics can be challenging, but building trust and confirming their prior beliefs can help in gaining acceptance.
Combining subjective beliefs with objective data in Bayesian analysis leads to more accurate forecasts.
The availability of massive data sets has revolutionized sports analytics, allowing for more complex and accurate models.
Sports analytics models should consider factors like rest, travel, and altitude to capture the full picture of team performance.
The impact of budget on team performance in American sports and the use of plus-minus models in basketball and American football are important considerations in sports analytics (a toy sketch follows the chapter list below).
The future of sports analytics lies in making analysis more accessible and digestible for everyday fans.
There is a need for more focus on estimating distributions and variance around estimates in sports analytics.
AI tools can empower analysts to do their own analysis and make better decisions, but it's important to ensure they understand the assumptions and structure of the data.
Measuring the value of certain positions, such as midfielders in soccer, is a challenging problem in sports analytics.
Game theory plays a significant role in sports strategies, and optimal strategies can change over time as the game evolves.
Chapters
00:00 Introduction and Overview
09:27 The Power of Bayesian Analysis in Sports Modeling
16:28 The Revolution of Massive Data Sets in Sports Analytics
31:03 The Impact of Budget in Sports Analytics
39:35 Introduction to Sports Analytics
52:22 Plus-Minus Models in American Football
01:04:11 The Future of Sports Analytics
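The plus-minus models mentioned in the takeaways are, at their core, regularized regressions of score differential on who is on the court or field. Here is a minimal, hypothetical sketch of that idea, not Paul's actual models: the player values, stint structure, and ridge penalty are all made up for illustration, and the ridge shrinkage plays the role a Gaussian prior would play in a Bayesian version.

```python
# Illustrative adjusted plus-minus sketch (hypothetical data, not Paul's models):
# regress point differential per stint on player indicators, with ridge shrinkage.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_players, n_stints = 8, 500
true_value = rng.normal(0, 2, size=n_players)          # hypothetical player values

# Design matrix: +1 if a player is on the scoring side, -1 if on the conceding side, 0 if off.
X = np.zeros((n_stints, n_players))
for i in range(n_stints):
    on_court = rng.choice(n_players, size=4, replace=False)
    X[i, on_court[:2]] = 1.0
    X[i, on_court[2:]] = -1.0

y = X @ true_value + rng.normal(0, 3, size=n_stints)    # observed point differential per stint

model = Ridge(alpha=10.0).fit(X, y)                     # shrinkage stabilizes low-minute players
print(np.round(model.coef_, 2))                         # estimated player values
```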
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
LBS Sports Analytics playlist: https://www.youtube.com/playlist?list=PL7RjIaSLWh5kDiPVMUSyhvFaXL3NoXOe4
Paul’s website: https://sabinanalytics.com/
Paul on GitHub: https://github.com/sabinanalytics
Paul on Linkedin: https://www.linkedin.com/in/rpaulsabin/
Paul on Twitter: https://twitter.com/SabinAnalytics
Paul on Google Scholar: https://scholar.google.com/citations?user=wAezxZ4AAAAJ&hl=en
Soccer Power Ratings & Projections: https://sabinanalytics.com/ratings/soccer/
Estimating player value in American football using plus–minus models: https://www.degruyter.com/document/doi/10.1515/jqas-2020-0033/html
World Football R Package: https://github.com/JaseZiv/worldfootballR
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Fri, 14 Jun 2024 - 1h 18min - 122 - #107 Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
In this episode, Marvin Schmitt introduces the concept of amortized Bayesian inference, where the upfront training phase of a neural network is followed by fast posterior inference.
Marvin will guide us through this new concept, discussing his work in probabilistic machine learning and uncertainty quantification, using Bayesian inference with deep neural networks.
He also introduces BayesFlow, a Python library for amortized Bayesian workflows, and discusses its use cases in various fields, while also touching on the concept of deep fusion and its relation to multimodal simulation-based inference.
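To make the amortization idea concrete, here is a deliberately simplified sketch, not the BayesFlow API: we pay an upfront cost simulating (parameter, data) pairs from the prior and training a network that maps data summaries to a parameter estimate, after which inference on any new dataset is a single forward pass. A real amortized workflow learns a full conditional posterior rather than the point estimate used here; all names and settings below are illustrative assumptions.

```python
# Minimal sketch of amortized inference (illustration only, not BayesFlow):
# train once on simulations, then reuse the network for instant inference.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

def simulate(theta, n=50):
    """Toy simulator: Gaussian data with unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summary(theta):
    x = simulate(theta)
    return [x.mean(), x.std()]

# Upfront (expensive) phase: prior draws -> simulated summaries -> train the network.
thetas = rng.normal(0, 3, size=5000)
summaries = np.array([summary(t) for t in thetas])
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500).fit(summaries, thetas)

# Amortized (cheap) phase: a new observed dataset gets an instant estimate, no refitting.
x_obs = rng.normal(1.7, 1.0, size=50)
print(net.predict([[x_obs.mean(), x_obs.std()]]))   # close to 1.7
```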
A PhD student in computer science at the University of Stuttgart, Marvin is supervised by two LBS guests you surely know — Paul Bürkner and Aki Vehtari. Marvin’s research combines deep learning and statistics, to make Bayesian inference fast and trustworthy.
In his free time, Marvin enjoys board games and is a passionate guitar player.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Amortized Bayesian inference combines deep learning and statistics to make posterior inference fast and trustworthy.
- Bayesian neural networks can be used for full Bayesian inference on neural network weights.
- Amortized Bayesian inference decouples the training phase and the posterior inference phase, making posterior sampling much faster.
- BayesFlow is a Python library for amortized Bayesian workflows, providing a user-friendly interface and modular architecture.
- Self-consistency loss is a technique that combines simulation-based inference and likelihood-based Bayesian inference, with a focus on amortization.
- The BayesFlow package aims to make amortized Bayesian inference more accessible and provides sensible default values for neural networks.
- Deep fusion techniques allow for the fusion of multiple sources of information in neural networks.
- Generative models that are expressive and have one-step inference are an emerging topic in deep learning and probabilistic machine learning.
- Foundation models, which have a large training set and can handle out-of-distribution cases, are another intriguing area of research.
Chapters:
00:00 Introduction to Amortized Bayesian Inference
07:39 Bayesian Neural Networks
11:47 Amortized Bayesian Inference and Posterior Inference
23:20 BayesFlow: A Python Library for Amortized Bayesian Workflows
38:15 Self-consistency loss: Bridging Simulation-Based Inference and Likelihood-Based Bayesian Inference
41:35 Amortized Bayesian Inference
43:53 Fusing Multiple Sources of Information
45:19 Compensating for Missing Data
56:17 Emerging Topics: Expressive Generative Models and Foundation Models
01:06:18 The Future of Deep Learning and Probabilistic Machine Learning
Links from the show:
Marvin’s website: https://www.marvinschmitt.com/
Marvin on GitHub: https://github.com/marvinschmitt
Marvin on Linkedin: https://www.linkedin.com/in/marvin-schmitt/
Marvin on Twitter: https://twitter.com/MarvinSchmittML
The BayesFlow package for amortized Bayesian workflows: https://bayesflow.org/
BayesFlow Forums for users: https://discuss.bayesflow.org
BayesFlow software paper (JOSS): https://joss.theoj.org/papers/10.21105/joss.05702
Tutorial on amortized Bayesian inference with BayesFlow (Python): https://colab.research.google.com/drive/1ub9SivzBI5fMbSTwVM1pABsMlRupgqRb?usp=sharing
Towards Reliable Amortized Bayesian Inference: https://www.marvinschmitt.com/speaking/pdf/slides_reliable_abi_botb.pdf
Expand the model space that we amortize over (multiverse analyses, power scaling, …): “Sensitivity-Aware Amortized Bayesian Inference” https://arxiv.org/abs/2310.11122
Use heterogeneous data sources in amortized inference: “Fuse It or Lose It: Deep Fusion for Multimodal Simulation-Based Inference” https://arxiv.org/abs/2311.10671
Use likelihood density information (explicit or even learned on the fly): “Leveraging Self-Consistency for Data-Efficient Amortized Bayesian Inference” https://arxiv.org/abs/2310.04395
LBS #98 Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
LBS #101 Black Holes Collisions & Gravitational Waves, with LIGO Experts Christopher Berry & John Veitch: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
Deep Learning book: https://www.deeplearningbook.org/
Statistical Rethinking: https://xcelab.net/rm/
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 29 May 2024 - 1h 21min - 121 - #106 Active Statistics, Two Truths & a Lie, with Andrew Gelman
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
If there is one guest I don’t need to introduce, it’s mister Andrew Gelman. So… I won’t! I will refer you back to his two previous appearances on the show though, because learning from Andrew is always a pleasure. So go ahead and listen to episodes 20 and 27.
In this episode, Andrew and I discuss his new book, Active Statistics, which focuses on teaching and learning statistics through active student participation. Like this episode, the book is divided into three parts: 1) The ideas of statistics, regression, and causal inference; 2) The value of storytelling to make statistical concepts more relatable and interesting; 3) The importance of teaching statistics in an active learning environment, where students are engaged in problem-solving and discussion.
And Andrew is so active and knowledgeable that we of course touched on a variety of other topics — but for that, you’ll have to listen ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Active learning is essential for teaching and learning statistics.
- Storytelling can make statistical concepts more relatable and interesting.
- Teaching statistics in an active learning environment engages students in problem-solving and discussion.
- The book Active Statistics includes 52 stories, class participation activities, computer demonstrations, and homework assignments to facilitate active learning.
- Active learning, where students actively engage with the material through activities and discussions, is an effective approach to teaching statistics.
- The flipped classroom model, where students read and prepare before class and engage in problem-solving activities during class, can enhance learning and understanding.
- Clear organization and fluency in teaching statistics are important for student comprehension and engagement.
- Visualization plays a crucial role in understanding statistical concepts and aids in comprehension.
- The future of statistical education may involve new approaches and technologies, but the challenge lies in finding effective ways to teach basic concepts and make them relevant to real-world problems.
Chapters:
00:00 Introduction and Background
08:09 The Importance of Stories in Statistics Education
30:28 Using 'Two Truths and a Lie' to Teach Logistic Regression
38:08 The Power of Storytelling in Teaching Statistics
57:26 The Importance of Visualization in Understanding Statistics
01:07:03 The Future of Statistical Education
Links from the show:
Andrew’s website: http://www.stat.columbia.edu/~gelman/
Andrew’s blog: https://statmodeling.stat.columbia.edu/
Twitter links to blog posts: https://twitter.com/statmodeling
Active Statistics page: http://www.stat.columbia.edu/~gelman/active-statistics/
“Two truths and a lie” as a class-participation activity: http://www.stat.columbia.edu/~gelman/research/published/truths_paper.pdf
Rohan Alexander’s book, Telling Stories with Data: https://tellingstorieswithdata.com/
Use code ACTSTAT24 to buy Active Statistics with 20% off through July 15, 2024: www.cambridge.org/9781009436212
LBS #27, Modeling the US Presidential Elections, with Andrew Gelman & Merlin Heidemanns: https://learnbayesstats.com/episode/27-modeling-the-us-presidential-elections-with-andrew-gelman-merlin-heidemanns/
LBS #20 Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari: https://learnbayesstats.com/episode/20-regression-and-other-stories-with-andrew-gelman-jennifer-hill-aki-vehtari/
Slamming the sham – A Bayesian model for adaptive adjustment with noisy control data: http://www.stat.columbia.edu/~gelman/research/unpublished/chickens.pdf
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Thu, 16 May 2024 - 1h 16min - 120 - #105 The Power of Bayesian Statistics in Glaciology, with Andy Aschwanden & Doug Brinkerhoff
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
In this episode, Andy Aschwanden and Doug Brinkerhoff tell us about their work in glaciology and the application of Bayesian statistics in studying glaciers. They discuss the use of computer models and data analysis in understanding glacier behavior and predicting sea level rise, and a lot of other fascinating topics.
Andy grew up in the Swiss Alps, and studied Earth Sciences, with a focus on atmospheric and climate science and glaciology. After his PhD, Andy moved to Fairbanks, Alaska, and became involved with the Parallel Ice Sheet Model, the first open-source and openly-developed ice sheet model.
His first PhD student was none other than… Doug Brinkerhoff! Doug did an MS in computer science at the University of Montana, focusing on numerical methods for ice sheet modeling, and then moved to Fairbanks to complete his PhD. While in Fairbanks, he became an ardent Bayesian after “seeing that uncertainty needs to be embraced rather than ignored”. Doug has since moved back to Montana, joining the faculty of the University of Montana’s computer science department.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero and Will Geary.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Computer models and data analysis play a crucial role in understanding glacier behavior and predicting sea level rise.
- Reliable data, especially on ice thickness and climate forcing, are essential for accurate modeling.
- The collaboration between glaciology and Bayesian statistics has led to breakthroughs in understanding glacier evolution forecasts.
- There is a need for open-source packages and tools to make glaciological models more accessible.
- Glaciology and ice sheet modeling are complex fields that require collaboration between domain experts and data scientists.
- The use of Bayesian statistics in glaciology allows for a probabilistic framework to understand and communicate uncertainty in predictions.
- Real-time forecasting of glacier behavior is an exciting area of research that could provide valuable information for communities living near glaciers.
- There is a need for further research in understanding existing data sets and developing simpler methods to analyze them.
- The future of glaciology research lies in studying Alaskan glaciers and understanding the challenges posed by the changing Arctic environment.
Chapters:
00:00 Introduction and Background
08:54 The Role of Statistics in Glaciology
31:46 Open-Source Packages and Tools
52:06 The Power of Bayesian Statistics in Glaciology
01:06:34 Understanding Existing Data Sets and Developing Simpler Methods
Links from the show:
Andy’s website: https://glaciers.gi.alaska.edu/people/aschwanden
Doug’s website: https://dbrinkerhoff.org/
Andy on GitHub: https://github.com/aaschwanden
Doug on GitHub: https://github.com/douglas-brinkerhoff/
Andy on Twitter: https://twitter.com/glacierandy?lang=fr
Andy on Google Scholar: https://scholar.google.com/citations?user=CuvsLvMAAAAJ&hl=en
Doug on Google Scholar: https://scholar.google.com/citations?user=FqU6ON8AAAAJ&hl=en
LBS #64, Modeling the Climate & Gravity Waves, with Laura Mansfield: https://learnbayesstats.com/episode/64-modeling-climate-gravity-waves-laura-mansfield/
Parallel Ice Sheet Model: www.pism.io
PISM on GitHub: https://github.com/pism/pism
Greenland View of Three Simulated Greenland Ice Sheet Response Scenarios: https://svs.gsfc.nasa.gov/4727/
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Thu, 02 May 2024 - 1h 15min - 119 - #104 Automated Gaussian Processes & Sequential Monte Carlo, with Feras Saad
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
GPs are extremely powerful… but hard to handle. One of the bottlenecks is learning the appropriate kernel. What if you could learn the structure of GP kernels automatically? Sounds really cool, but also a bit futuristic, doesn’t it?
Well, think again, because in this episode, Feras Saad will teach us how to do just that! Feras is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. He received his PhD in Computer Science from MIT, and, most importantly for our conversation, he’s the creator of AutoGP.jl, a Julia package for automatic Gaussian process modeling.
Feras discusses the implementation of AutoGP, how it scales, what you can do with it, and how you can integrate its outputs in your models.
Finally, Feras provides an overview of Sequential Monte Carlo and its usefulness in AutoGP, highlighting the ability of SMC to incorporate new data in a streaming fashion and explore multiple modes efficiently.
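AutoGP itself is a Julia package built on Gen, but the core idea of searching over a space of covariance structures can be sketched in a few lines of Python. The example below is an illustration under my own assumptions, not AutoGP's algorithm: it scores a handful of composite kernels by marginal likelihood with scikit-learn, whereas AutoGP explores a full kernel grammar with sequential Monte Carlo and involutive MCMC.

```python
# Toy kernel-structure search (illustration only, not AutoGP.jl):
# score a few composite covariance functions by their log marginal likelihood.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 120)[:, None]
y = np.sin(2 * X[:, 0]) + 0.05 * X[:, 0] ** 2 + rng.normal(0, 0.1, 120)  # periodic + trend

candidates = {
    "smooth":            RBF() + WhiteKernel(),
    "periodic":          ExpSineSquared() + WhiteKernel(),
    "smooth + periodic": RBF() + ExpSineSquared() + WhiteKernel(),
    "smooth * periodic": RBF() * ExpSineSquared() + WhiteKernel(),
}

for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    print(f"{name:>18}: log marginal likelihood = {gp.log_marginal_likelihood_value_:.1f}")
```

A greedy search would keep extending the best-scoring structure; the SMC approach discussed in the episode instead maintains a whole population of candidate structures and can revise them as new data stream in.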
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell and Gal Kampel.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- AutoGP is a Julia package for automatic Gaussian process modeling that learns the structure of GP kernels automatically.
- It addresses the challenge of making structural choices for covariance functions by using a symbolic language and a recursive grammar to infer the expression of the covariance function given the observed data.
- AutoGP incorporates sequential Monte Carlo inference to handle scalability and uncertainty in structure learning.
- The package is implemented in Julia using the Gen probabilistic programming language, which provides support for sequential Monte Carlo and involutive MCMC.
- Sequential Monte Carlo (SMC) and involutive MCMC are used in AutoGP to infer the structure of the model.
- Integrating probabilistic models with language models can improve interpretability and trustworthiness in data-driven inferences.
- Challenges in Bayesian workflows include the need for automated model discovery and scalability of inference algorithms.
- Future developments in probabilistic reasoning systems include unifying people around data-driven inferences and improving the scalability and configurability of inference algorithms.
Chapters:
00:00 Introduction to AutoGP
26:28 Automatic Gaussian Process Modeling
45:05 AutoGP: Automatic Discovery of Gaussian Process Model Structure
53:39 Applying AutoGP to New Settings
01:09:27 The Biggest Hurdle in the Bayesian Workflow
01:19:14 Unifying People Around Data-Driven Inferences
Links from the show:
Sign up to the Fast & Efficient Gaussian Processes modeling webinar: https://topmate.io/alex_andorra/901986
Feras’ website: https://www.cs.cmu.edu/~fsaad/
LBS #3.1, What is Probabilistic Programming & Why use it, with Colin Carroll: https://learnbayesstats.com/episode/3-1-what-is-probabilistic-programming-why-use-it-with-colin-carroll/
LBS #3.2, How to use Bayes in industry, with Colin Carroll: https://learnbayesstats.com/episode/3-2-how-to-use-bayes-in-industry-with-colin-carroll/
LBS #21, Gaussian Processes, Bayesian Neural Nets & SIR Models, with Elizaveta Semenova: https://learnbayesstats.com/episode/21-gaussian-processes-bayesian-neural-nets-sir-models-with-elizaveta-semenova/
LBS #29, Model Assessment, Non-Parametric Models, And Much More, with Aki Vehtari: https://learnbayesstats.com/episode/model-assessment-non-parametric-models-aki-vehtari/
LBS #63, Media Mix Models & Bayes for Marketing, with Luciano Paz: https://learnbayesstats.com/episode/63-media-mix-models-bayes-marketing-luciano-paz/
LBS #83, Multilevel Regression, Post-Stratification & Electoral Dynamics, with Tarmo Jüristo: https://learnbayesstats.com/episode/83-multilevel-regression-post-stratification-electoral-dynamics-tarmo-juristo/
AutoGP.jl, A Julia package for learning the covariance structure of Gaussian process time series models: https://probsys.github.io/AutoGP.jl/stable/
Sequential Monte Carlo Learning for Time Series Structure Discovery: https://arxiv.org/abs/2307.09607
Street Epistemology: https://www.youtube.com/@magnabosco210
You're not so smart Podcast: https://youarenotsosmart.com/podcast/
How Minds Change: https://www.davidmcraney.com/howmindschangehome
Josh Tenenbaum's lectures on computational cognitive science: https://www.youtube.com/playlist?list=PLUl4u3cNGP61RTZrT3MIAikp2G5EEvTjf
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 16 Apr 2024 - 1h 30min - 118 - #103 Improving Sampling Algorithms & Prior Elicitation, with Arto Klami
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Changing perspective is often a great way to solve burning research problems. Riemannian spaces are such a perspective change, as Arto Klami, an Associate Professor of computer science at the University of Helsinki and member of the Finnish Center for Artificial Intelligence, will tell us in this episode.
He explains the concept of Riemannian spaces, their application in inference algorithms, how they can help with sampling Bayesian models, and their similarity to normalizing flows, which we discussed in episode 98.
Arto also introduces PreliZ, a tool for prior elicitation, and highlights its benefits in simplifying the process of setting priors, thus improving the accuracy of our models.
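Prior elicitation often boils down to translating a domain statement like "I'm 90% sure this effect lies between -1 and 1" into distribution parameters. The sketch below does that by hand with SciPy for a Normal prior; it is only an illustration of the kind of bookkeeping PreliZ automates (the actual PreliZ API differs, and it also handles constrained families, maximum-entropy fits, and visualization).

```python
# Hand-rolled prior elicitation (illustration only; PreliZ automates this and more):
# find the Normal(mu, sigma) whose central 90% interval is [-1, 1].
from scipy import stats

lower, upper, mass = -1.0, 1.0, 0.90
mu = (lower + upper) / 2                    # symmetry puts the mean at the midpoint
z = stats.norm.ppf(0.5 + mass / 2)          # standard-normal quantile for the upper tail
sigma = (upper - mu) / z

prior = stats.norm(mu, sigma)
print(mu, round(sigma, 3))                  # ~ Normal(0, 0.608)
print(prior.cdf(upper) - prior.cdf(lower))  # sanity check: ≈ 0.90
```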
When Arto is not solving mathematical equations, you’ll find him cycling, or around a good board game.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Riemannian spaces offer a way to improve computational efficiency and accuracy in Bayesian inference by considering the curvature of the posterior distribution.
- Riemannian spaces can be used in Laplace approximation and Markov chain Monte Carlo algorithms to better model the posterior distribution and explore challenging areas of the parameter space.
- Normalizing flows are a complementary approach to Riemannian spaces, using non-linear transformations to warp the parameter space and improve sampling efficiency.
- Evaluating the performance of Bayesian inference algorithms in challenging cases is a current research challenge, and more work is needed to establish benchmarks and compare different methods.
- PreliZ is a package for prior elicitation in Bayesian modeling that facilitates communication with users through visualizations of predictive and parameter distributions.
- Careful prior specification is important, and tools like PreliZ make the process easier and more reproducible.
- Teaching Bayesian machine learning is challenging due to the combination of statistical and programming concepts, but it is possible to teach the basic reasoning behind Bayesian methods to a diverse group of students.
- The integration of Bayesian approaches in data science workflows is becoming more accepted, especially in industries that already use deep learning techniques.
- The future of Bayesian methods in AI research may involve the development of AI assistants for Bayesian modeling and probabilistic reasoning.
Chapters:
00:00 Introduction and Background
02:05 Arto's Work and Background
06:05 Introduction to Bayesian Inference
12:46 Riemannian Spaces in Bayesian Inference
27:24 Availability of Riemannian-based Algorithms
30:20 Practical Applications and Evaluation
37:33 Introduction to PreliZ
38:03 Prior Elicitation
39:01 Predictive Elicitation Techniques
39:30 PreliZ: Interface with Users
40:27 PreliZ: General Purpose Tool
41:55 Getting Started with PreliZ
42:45 Challenges of Setting Priors
45:10 Reproducibility and Transparency in Priors
46:07 Integration of Bayesian Approaches in Data Science Workflows
55:11 Teaching Bayesian Machine Learning
01:06:13 The Future of Bayesian Methods with AI Research
01:10:16 Solving the Prior Elicitation Problem
Links from the show:
LBS #29, Model Assessment, Non-Parametric Models, And Much More, with Aki Vehtari: https://learnbayesstats.com/episode/model-assessment-non-parametric-models-aki-vehtari/
LBS #20 Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari: https://learnbayesstats.com/episode/20-regression-and-other-stories-with-andrew-gelman-jennifer-hill-aki-vehtari/
LBS #98 Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
Arto’s website: https://www.cs.helsinki.fi/u/aklami/
Arto on Google Scholar: https://scholar.google.com/citations?hl=en&user=v8PeLGgAAAAJ
Multi-source probabilistic inference Group: https://www.helsinki.fi/en/researchgroups/multi-source-probabilistic-inference
FCAI web page: https://fcai.fi
Probabilistic AI summer school lectures: https://www.youtube.com/channel/UCcMwNzhpePJE3xzOP_3pqsw
Keynote: "Better priors for everyone" by Arto Klami: https://www.youtube.com/watch?v=mEmiEHsfWyc&ab_channel=ProbabilisticAISchool
Variational Inference and Optimization I by Arto Klami: https://www.youtube.com/watch?v=60USDNc1nE8&list=PLRy-VW__9hV8s--JkHXZvnd26KgjRP2ik&index=3&ab_channel=ProbabilisticAISchool
PreliZ, A tool-box for prior elicitation: https://preliz.readthedocs.io/en/latest/
AISTATS paper that presents the new computationally efficient metric in the context of MCMC: https://researchportal.helsinki.fi/en/publications/lagrangian-manifold-monte-carlo-on-monge-patches
TMLR paper that scales up the solution for larger models, using the metric for sampling-based inference in deep learning: https://openreview.net/pdf?id=dXAuvo6CGI
Riemannian Laplace approximation (to appear in AISTATS’24): https://arxiv.org/abs/2311.02766
Prior Knowledge Elicitation -- The Past, Present, and Future: https://projecteuclid.org/journals/bayesian-analysis/advance-publication/Prior-Knowledge-Elicitation-The-Past-Present-and-Future/10.1214/23-BA1381.full
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Fri, 05 Apr 2024 - 1h 14min - 117 - #102 Bayesian Structural Equation Modeling & Causal Inference in Psychometrics, with Ed Merkle
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Structural Equation Modeling (SEM) is a key framework in causal inference. As I’m diving deeper and deeper into these topics to teach them and, well, finally understand them, I was delighted to host Ed Merkle on the show.
A professor of psychological sciences at the University of Missouri, Ed discusses his work on Bayesian applications to psychometric models and model estimation, particularly in the context of Bayesian SEM. He explains the importance of BSEM in psychometrics and the challenges encountered in its estimation.
Ed also introduces his blavaan package in R, which enhances researchers' capabilities in BSEM and has been instrumental in the dissemination of these methods. Additionally, he explores the role of Bayesian methods in forecasting and crowdsourcing wisdom.
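blavaan is an R package built on Stan, so the snippet below is not blavaan code; it is a minimal PyMC sketch of the kind of one-factor latent-variable model that BSEM generalizes, with simulated data, positivity constraints on the loadings for identification, and the latent factor given a standard-normal scale. All variable names and priors here are illustrative assumptions.

```python
# Minimal one-factor latent-variable model in PyMC (illustration, not blavaan):
# three observed indicators load on a single latent factor.
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n = 200
eta_true = rng.normal(size=n)                                   # latent factor scores
Y = np.column_stack(
    [l * eta_true + rng.normal(scale=0.5, size=n) for l in (1.0, 0.8, 0.6)]
)

with pm.Model():
    lam = pm.HalfNormal("lam", 1.0, shape=3)        # loadings, kept positive for identification
    sigma = pm.HalfNormal("sigma", 1.0, shape=3)    # residual SDs of the indicators
    eta = pm.Normal("eta", 0.0, 1.0, shape=n)       # latent factor, variance fixed to 1
    mu = eta[:, None] * lam                         # implied indicator means, shape (n, 3)
    pm.Normal("Y_obs", mu=mu, sigma=sigma, observed=Y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

print(idata.posterior["lam"].mean(dim=("chain", "draw")).values)  # ≈ [1.0, 0.8, 0.6]
```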
When he’s not thinking about stats and psychology, Ed can be found running, playing the piano, or playing 8-bit video games.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Bayesian SEM is a powerful framework in psychometrics that allows for the estimation of complex models involving multiple variables and causal relationships.
- Understanding the principles of Bayesian inference is crucial for effectively applying Bayesian SEM in psychological research.
- Informative priors play a key role in Bayesian modeling, providing valuable information and improving the accuracy of model estimates.
- Challenges in BSEM estimation include specifying appropriate prior distributions, dealing with unidentified parameters, and ensuring convergence of the model.
- Incorporating prior information is crucial in Bayesian modeling, especially when dealing with large models and imperfect data.
- The blavaan package enhances researchers' capabilities in Bayesian structural equation modeling, providing a user-friendly interface and compatibility with existing frequentist models.
- Bayesian methods offer advantages in forecasting and subjective probability by allowing for the characterization of uncertainty and providing a range of predictions.
- Interpreting Bayesian model results requires careful consideration of the entire posterior distribution, rather than focusing solely on point estimates.
- Latent variable models, also known as structural equation models, play a crucial role in psychometrics, allowing for the estimation of unobserved variables and their influence on observed variables.
- The speed of MCMC estimation and the need for a slower, more thoughtful workflow are common challenges in the Bayesian workflow.
- The future of Bayesian psychometrics may involve advancements in parallel computing and GPU-accelerated MCMC algorithms.
Chapters:
00:00 Introduction to the Conversation
02:17 Background and Work on Bayesian SEM
04:12 Topics of Focus: Structural Equation Models
05:16 Introduction to Bayesian Inference
09:30 Importance of Bayesian SEM in Psychometrics
10:28 Overview of Bayesian Structural Equation Modeling (BSEM)
12:22 Relationship between BSEM and Causal Inference
15:41 Advice for Learning BSEM
21:57 Challenges in BSEM Estimation
34:40 The Impact of Model Size and Data Quality
37:07 The Development of the Blavaan Package
42:16 Bayesian Methods in Forecasting and Subjective Probability
46:27 Interpreting Bayesian Model Results
51:13 Latent Variable Models in Psychometrics
56:23 Challenges in the Bayesian Workflow
01:01:13 The Future of Bayesian Psychometrics
Links from the show:
Ed’s website: https://ecmerkle.github.io/
Ed on Mastodon: https://mastodon.sdf.org/@edgarmerkle
Ed on BlueSky: @edgarmerkle.bsky.social
Ed on GitHub: https://github.com/ecmerkle
blavaan R package: https://ecmerkle.github.io/blavaan/
Resources on how to use blavaan: https://ecmerkle.github.io/blavaan/articles/resources.html
Richard McElreath, Table 2 Fallacy: https://youtu.be/uanZZLlzKHw?si=vssrwJsvGO5HhH5H&t=4323
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 20 Mar 2024 - 1h 08min - 116 - How to find black holes with Bayesian inference
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Listen to the full episode: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
Watch the interview: https://www.youtube.com/watch?v=ZaZwCcrJlik
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Sat, 16 Mar 2024 - 12min - 115 - How can we even hear gravitational waves?
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Listen to the full episode: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
Watch the interview: https://www.youtube.com/watch?v=ZaZwCcrJlik
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Thu, 14 Mar 2024 - 08min - 114 - #101 Black Holes Collisions & Gravitational Waves, with LIGO Experts Christopher Berry & John Veitch
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
In this episode, we dive deep into gravitational wave astronomy, with Christopher Berry and John Veitch, two senior lecturers at the University of Glasgow and experts from the LIGO-VIRGO collaboration. They explain the significance of detecting gravitational waves, which are essential for understanding black hole and neutron star collisions. This research not only sheds light on these distant events but also helps us grasp the fundamental workings of the universe.
Our discussion focuses on the integral role of Bayesian statistics, detailing how they use nested sampling for extracting crucial information from the subtle signals of gravitational waves. This approach is vital for parameter estimation and understanding the distribution of cosmic sources through population inferences.
Concluding the episode, Christopher and John highlight the latest advancements in black hole astrophysics and tests of general relativity, and touch upon the exciting prospects and challenges of the upcoming space-based LISA mission.
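Nested sampling is the workhorse mentioned here because it returns both posterior samples and the evidence needed for model comparison. Below is a minimal, self-contained example using dynesty (the package linked in the show notes) on a toy 2-D Gaussian likelihood rather than a gravitational-wave model; the prior ranges and likelihood are placeholders chosen for illustration.

```python
# Minimal nested sampling run with dynesty (toy likelihood, not a GW analysis).
import numpy as np
from dynesty import NestedSampler
from dynesty import utils as dyfunc

def loglike(theta):
    """Toy likelihood: two independent Gaussians of width 0.5 centered at 0."""
    return -0.5 * np.sum((theta / 0.5) ** 2) - np.log(2 * np.pi * 0.5 ** 2)

def prior_transform(u):
    """Map the unit cube to uniform priors on [-5, 5] for both parameters."""
    return 10.0 * u - 5.0

sampler = NestedSampler(loglike, prior_transform, ndim=2, nlive=500)
sampler.run_nested(print_progress=False)
results = sampler.results

print("log evidence:", results.logz[-1], "+/-", results.logzerr[-1])

# Resample to equally weighted posterior draws.
weights = np.exp(results.logwt - results.logz[-1])
samples = dyfunc.resample_equal(results.samples, weights)
print(samples.mean(axis=0))   # ≈ [0, 0]
```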
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
⁃ Gravitational wave analysis involves using Bayesian statistics for parameter estimation and population inference.
⁃ Nested sampling is a powerful algorithm used in gravitational wave analysis to explore parameter space and calculate the evidence for model selection.
⁃ Machine learning techniques, such as normalizing flows, can be integrated with nested sampling to improve efficiency and explore complex distributions.
⁃ The LIGO-VIRGO collaboration operates gravitational wave detectors that measure distortions in space and time caused by black hole and neutron star collisions.
⁃ Sources of noise in gravitational wave detection include laser noise, thermal noise, seismic motion, and gravitational coupling.
⁃ The LISA mission is a space-based gravitational wave detector that aims to observe lower frequency gravitational waves and unlock new astrophysical phenomena.
⁃ Space-based detectors like LISA can avoid the ground-based noise and observe a different part of the gravitational wave spectrum, providing new insights into the universe.
⁃ The data analysis challenges for space-based detectors are complex, as they require fitting multiple sources simultaneously and dealing with overlapping signals.
⁃ Gravitational wave observations have the potential to test general relativity, study the astrophysics of black holes and neutron stars, and provide insights into cosmology.
Links from the show:
Christopher’s website: https://cplberry.com/
John’s website: https://www.veitch.me.uk/
Christopher on GitHub: https://github.com/cplb/
John on GitHub: https://github.com/johnveitch
Christopher on Linkedin: http://www.linkedin.com/in/cplberry
John on Linkedin: https://www.linkedin.com/in/john-veitch-56772225/
Christopher on Twitter: https://twitter.com/cplberry
John on Twitter: https://twitter.com/johnveitch
Christopher on Mastodon: https://mastodon.scot/@cplberry@mastodon.online
John on Mastodon: https://mastodon.scot/@JohnVeitch
LIGO website: https://www.ligo.org/
LIGO Gitlab: https://git.ligo.org/users/sign_in
Gravitational Wave Open Science Center: https://gwosc.org/
LIGO Caltech Lab: https://www.ligo.caltech.edu/page/ligo-data
Exoplanet, python package for probabilistic modeling of time series data in astronomy: https://docs.exoplanet.codes/en/latest/
Dynamic Nested Sampling with dynesty: https://dynesty.readthedocs.io/en/latest/dynamic.html
Nessai, Nested sampling with artificial intelligence: https://nessai.readthedocs.io/
LBS #98 Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
bayeux, JAX models with state-of-the-art inference methods: https://jax-ml.github.io/bayeux/
LBS #51 Bernoulli’s Fallacy & the Crisis of Modern Science, with Aubrey Clayton: https://learnbayesstats.com/episode/51-bernoullis-fallacy-crisis-modern-science-aubrey-clayton/
Aubrey Clayton's Probability Theory Lectures based on E.T. Jaynes book: https://www.youtube.com/playlist?list=PL9v9IXDsJkktefQzX39wC2YG07vw7DsQ_
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Thu, 07 Mar 2024 - 1h 09min - 113 - The Role of Variational Inference in Reactive Message Passing
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Listen to the full episode: https://learnbayesstats.com/episode/100-reactive-message-passing-automated-inference-in-julia-dmitry-bagaev/
Watch the interview: https://www.youtube.com/watch?v=ZG3H0xxCXTQ
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Fri, 01 Mar 2024 - 10min - 112 - Reactive Message Passing in Bayesian Inference
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Listen to the full episode: https://learnbayesstats.com/episode/100-reactive-message-passing-automated-inference-in-julia-dmitry-bagaev/
Watch the interview: https://www.youtube.com/watch?v=ZG3H0xxCXTQ
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Wed, 28 Feb 2024 - 08min - 111 - #100 Reactive Message Passing & Automated Inference in Julia, with Dmitry Bagaev
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
In this episode, Dmitry Bagaev discusses his work in Bayesian statistics and the development of RxInfer.jl, a reactive message passing toolbox for Bayesian inference.
Dmitry explains the concept of reactive message passing and its applications in real-time signal processing and autonomous systems. He discusses the challenges and benefits of using RxInfer.jl, including its scalability and efficiency in large probabilistic models.
Dmitry also shares insights into the trade-offs involved in Bayesian inference architecture and the role of variational inference in RxInfer.jl. Additionally, he discusses his startup Lazy Dynamics and its goal of commercializing research in Bayesian inference.
Finally, we also discuss the user-friendliness and trade-offs of different inference methods, the future developments of RxInfer, and the future of automated Bayesian inference.
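RxInfer itself is a Julia toolbox, so the snippet below is only a conceptual Python illustration of the "reactive" part of the conversation: beliefs updated incrementally as each observation streams in, rather than refit from scratch. It uses a conjugate Beta-Bernoulli update for simplicity; message passing on a factor graph, as in RxInfer, generalizes this idea to much richer models.

```python
# Streaming Bayesian updating (conceptual illustration only; RxInfer.jl does this
# with reactive message passing on factor graphs, not this conjugate shortcut).
import numpy as np

rng = np.random.default_rng(7)
stream = rng.binomial(1, 0.3, size=1000)   # incoming binary observations, true rate 0.3

alpha, beta = 1.0, 1.0                     # Beta(1, 1) prior on the rate
for t, x in enumerate(stream, start=1):
    alpha += x                             # each datum updates the posterior in O(1)
    beta += 1 - x
    if t % 250 == 0:
        mean = alpha / (alpha + beta)
        print(f"after {t:4d} observations: posterior mean = {mean:.3f}")
```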
Coming from a very small town in Russia called Nizhnekamsk, Dmitry currently lives in the Netherlands, where he did his PhD. Before that, he graduated from the Computational Science and Modeling department of Moscow State University.
Beyond that, Dmitry is also a drummer (you’ll see his cool drums if you’re watching on YouTube) and a fan of extreme sports like skydiving, wakeboarding, and skiing!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Reactive message passing is a powerful approach to Bayesian inference that allows for real-time updates and adaptivity in probabilistic models.
- RxInfer.jl is a toolbox for reactive message passing in Bayesian inference, designed to be scalable, efficient, and adaptable.
- Julia is a preferred language for RxInfer.jl due to its speed, macros, and multiple dispatch, which enable efficient and flexible implementation.
- Variational inference plays a crucial role in RxInfer.jl, allowing for trade-offs between computational complexity and accuracy in Bayesian inference.
- Lazy Dynamics is a startup focused on commercializing research in Bayesian inference, with the goal of making RxInfer.jl accessible and robust for industry applications.
Links from the show:
LBS Physics & Astrophysics playlist: https://learnbayesstats.com/physics-astrophysics/LBS #51, Bernoulli’s Fallacy & the Crisis of Modern Science, with Aubrey Clayton: https://learnbayesstats.com/episode/51-bernoullis-fallacy-crisis-modern-science-aubrey-clayton/Dmitry on GitHub: https://github.com/bvdmitriDmitry on LinkedIn: https://www.linkedin.com/in/bvdmitri/RxInfer.jl, Automatic Bayesian Inference through Reactive Message Passing: https://rxinfer.ml/Reactive Bayes, Open source software for reactive, efficient and scalable Bayesian inference: https://github.com/ReactiveBayesLazyDynamics, Reactive Bayesian AI: https://lazydynamics.com/BIASlab, Natural Artificial Intelligence: https://biaslab.github.io/Dmitry's PhD dissertation: https://research.tue.nl/en/publications/reactive-probabilistic-programming-for-scalable-bayesian-inferencEffortless Mastery, by Kenny Werner: https://www.amazon.com/Effortless-Mastery-Liberating-Master-Musician/dp/156224003XThe Book of Why, by Judea Pearl: https://www.amazon.com/Book-Why-Science-Cause-Effect/dp/046509760XBernoulli’s Fallacy, by Aubrey Clayton: https://www.amazon.com/Bernoullis-Fallacy-Statistical-Illogic-Science/dp/0231199945Software Engineering for Science: https://www.amazon.com/Software-Engineering-Science-Chapman-Computational/dp/1498743854Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 21 Feb 2024 - 54min - 110 - The biggest misconceptions about Bayes & Quantum Physics
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/99-exploring-quantum-physics-bayesian-stats-chris-ferrie/
Watch the interview: https://www.youtube.com/watch?v=pRaT6FLF7A8
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Fri, 16 Feb 2024 - 09min - 109 - Why would you use Bayesian Statistics?
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/99-exploring-quantum-physics-bayesian-stats-chris-ferrie/
Watch the interview: https://www.youtube.com/watch?v=pRaT6FLF7A8
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Wed, 14 Feb 2024 - 10min - 108 - #99 Exploring Quantum Physics with Bayesian Stats, with Chris Ferrie
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
You know I’m a big fan of everything physics. So when I heard that Bayesian stats was especially useful in quantum physics, I had to make an episode about it!
You’ll hear from Chris Ferrie, an Associate Professor at the Centre for Quantum Software and Information of the University of Technology Sydney. Chris also has a foot in industry, as a co-founder of Eigensystems, an Australian start-up with a mission to democratize access to quantum computing.
Of course, we talked about why Bayesian stats are helpful in quantum physics research, and about the burning challenges in this line of research.
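As a flavor of what Bayesian parameter estimation looks like in this setting, here is a toy sketch (my own illustration, not something from Chris's research): estimating a qubit rotation angle from repeated binary measurements with a simple grid posterior.

```python
import numpy as np

# Toy illustration: Bayesian estimation of a qubit rotation angle theta from
# binary measurement outcomes, where the probability of measuring 1 is
# sin^2(theta / 2). A grid posterior with a flat prior is enough to show the
# idea of parameter estimation in a quantum experiment.

rng = np.random.default_rng(0)
true_theta = 1.2
n_shots = 200
p_true = np.sin(true_theta / 2) ** 2
data = rng.binomial(1, p_true, size=n_shots)

theta_grid = np.linspace(0.01, np.pi - 0.01, 500)
p_grid = np.sin(theta_grid / 2) ** 2

# Binomial log-likelihood on the grid times a flat prior, then normalize
log_lik = data.sum() * np.log(p_grid) + (n_shots - data.sum()) * np.log(1 - p_grid)
posterior = np.exp(log_lik - log_lik.max())
posterior /= posterior.sum()

print("posterior mean of theta:", float((theta_grid * posterior).sum()))
```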
But Chris is also a renowned author — in addition to writing Bayesian Probability for Babies, he is the author of Quantum Physics for Babies and Quantum Bullsh*t: How to Ruin Your Life With Advice from Quantum Physics. So we ended up talking about science communication, science education, and a shocking revelation about Ant Man…
A big thank you to one of my best Patrons, Stefan Lorenz, for recommending an episode with Chris!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Quantum computing has the potential to revolutionize various industries, but it requires specialized tools and education to fully harness its power.
- Bayesian inference plays a crucial role in understanding and solving problems in quantum physics, particularly in parameter estimation and model building.
- The field of quantum physics faces challenges in experimental design, data collection, and maintaining the state of isolated quantum systems.
- There is a need for specialized software that can accommodate the unique constraints and models in quantum physics, allowing for more efficient and accurate analysis.
- Common misconceptions in quantum physics include the idea of superposition as being in two places at once and the misinterpretation of quantum experiments.
- Misconceptions in quantum physics and Bayesian probability are common and can be addressed through clear explanations and analogies.
- Communicating scientific concepts to the general public requires bridging the gap between scientific papers and mainstream media.
- Simplifying complex topics for young minds involves providing relatable examples, analogies, and categories.
- Studying mathematics is essential for a deeper understanding of quantum physics and statistics.
- Taking risks and making mistakes is encouraged in the early stages of a scientific career.
Links from the show:
Chris’ website: https://www.csferrie.com/Chris on Linkedin: https://www.linkedin.com/in/christopher-ferrie-63993190/Chris on Twitter: https://twitter.com/csferrieChris on Instagram: https://www.instagram.com/drchrisferrie/Chris’ YouTube channel: https://www.youtube.com/csferrieChris’ children’s books: https://www.amazon.com/gp/product/149267043XHow quantum mechanics turned me into a Bayesian: https://csferrie.medium.com/how-quantum-mechanics-turned-me-into-a-bayesian-655ddf88051fExoplanet, a python package for probabilistic modeling of time series data in astronomy: https://docs.exoplanet.codes/en/latest/Quantum Bullsh*t – How to Ruin Your Life with Advice from Quantum Physics : https://www.goodreads.com/en/book/show/61263731LBS #93, A CERN Odyssey, with Kevin Greif: https://www.youtube.com/watch?v=rOaqIIEtdpILBS #72, Why the Universe is so Deliciously Crazy, with Daniel Whiteson: https://learnbayesstats.com/episode/72-why-the-universe-is-so-deliciously-crazy-daniel-whiteson/LBS #97, Probably Overthinking Statistical Paradoxes, with Allen Downey: https://learnbayesstats.com/episode/97-probably-overthinking-statistical-paradoxes-allen-downey/LBS #51, Bernoulli’s Fallacy & the Crisis of Modern Science, with Aubrey Clayton: https://learnbayesstats.com/episode/51-bernoullis-fallacy-crisis-modern-science-aubrey-clayton/LBS #50, Ta(l)king Risks & Embracing Uncertainty, with David Spiegelhalter: https://learnbayesstats.com/episode/50-talking-risks-embracing-uncertainty-david-spiegelhalter/Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Fri, 09 Feb 2024 - 1h 07min - 107 - How do sampling algorithms scale?
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
Watch the interview: https://www.youtube.com/watch?v=vVqZ0WWXX7g
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Mon, 05 Feb 2024 - 09min - 106 - Why choose new algorithms instead of HMC?
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
Watch the interview: https://www.youtube.com/watch?v=vVqZ0WWXX7g
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Sun, 04 Feb 2024 - 08min - 105 - #98 Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
How does the world of statistical physics intertwine with machine learning, and what groundbreaking insights can this fusion bring to the field of artificial intelligence?
In this episode, we delve into these intriguing questions with Marylou Gabrié, an assistant professor at CMAP, École Polytechnique in Paris. Having completed her PhD in physics at the École Normale Supérieure, Marylou ventured to New York City for a joint postdoctoral appointment at New York University’s Center for Data Science and the Flatiron Institute’s Center for Computational Mathematics.
As you’ll hear, her research is not just about theoretical exploration; it also extends to the practical adaptation of machine learning techniques in scientific contexts, particularly where data is scarce.
In this conversation, we’ll traverse the landscape of Marylou's research, discussing her recent publications and her innovative approaches to machine learning challenges, latest MCMC advances, and ML-assisted scientific computing.
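To give a rough intuition for what "ML-assisted MCMC" means, here is a toy sketch (my own illustration, not Marylou's code): an independence Metropolis-Hastings sampler whose proposal distribution is a simple fitted Gaussian, standing in for the learned normalizing flow used in her adaptive Monte Carlo work.

```python
import numpy as np

# Toy sketch of the idea behind flow-assisted MCMC: an independence
# Metropolis-Hastings sampler whose proposal q is a wide Gaussian, playing the
# role of a learned normalizing flow. The acceptance ratio for an independence
# proposal is min(1, p(x') q(x) / (p(x) q(x'))).

rng = np.random.default_rng(1)

def log_target(x):
    # Bimodal target: unnormalized mixture of two Gaussians at -3 and +3
    return np.logaddexp(-0.5 * (x - 3) ** 2, -0.5 * (x + 3) ** 2)

def log_proposal(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

mu, sigma = 0.0, 4.0  # "learned" proposal parameters (here simply chosen wide)
x = 0.0
samples = []
for _ in range(5000):
    x_new = rng.normal(mu, sigma)
    log_alpha = (log_target(x_new) + log_proposal(x, mu, sigma)
                 - log_target(x) - log_proposal(x_new, mu, sigma))
    if np.log(rng.uniform()) < log_alpha:
        x = x_new
    samples.append(x)

print("sample mean (near 0 for this symmetric bimodal target):", np.mean(samples))
```

A good global proposal lets the chain jump between modes in one step, which is exactly where plain random-walk MCMC struggles, and why learning the proposal with a generative model is so appealing.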
Beyond that, get ready to discover the person behind the science – her inspirations, aspirations, and maybe even what she does when not decoding the complexities of machine learning algorithms!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Developing methods that leverage machine learning for scientific computing can provide valuable insights into high-dimensional probabilistic models.
- Generative models can be used to speed up Markov Chain Monte Carlo (MCMC) methods and improve the efficiency of sampling from complex distributions.
- The Adaptive Monte Carlo algorithm augmented with normalizing flows offers a powerful approach for sampling from multimodal distributions.
- Scaling the algorithm to higher dimensions and handling discrete parameters are ongoing challenges in the field.
- Open-source packages, such as Flow MC, provide valuable tools for researchers and practitioners to adopt and contribute to the development of new algorithms.
- The scaling of algorithms depends on the quantity of parameters and data. While some methods work well with a few hundred parameters, larger quantities can lead to difficulties.
- Generative models, such as normalizing flows, offer benefits in the Bayesian context, including amortization and the ability to adjust the model with new data.
- Machine learning and MCMC are complementary and should be used together rather than replacing one another.
- Machine learning can assist scientific computing in the context of scarce data, where expensive experiments or numerics are required.
- The future of MCMC lies in the exploration of sampling multimodal distributions and understanding resource limitations in scientific research.
Links from the show:
Marylou’s website: https://marylou-gabrie.github.io/Marylou on Linkedin: https://www.linkedin.com/in/marylou-gabri%C3%A9-95366172/Marylou on Twitter: https://twitter.com/marylougabMarylou on Github: https://github.com/marylou-gabrieMarylou on Google Scholar: https://scholar.google.fr/citations?hl=fr&user=5m1DvLwAAAAJAdaptive Monte Carlo augmented with normalizing flows: https://arxiv.org/abs/2105.12603Normalizing-flow enhanced sampling package for probabilistic inference: https://flowmc.readthedocs.io/en/main/Flow-based generative models for Markov chain Monte Carlo in lattice field theory: https://journals.aps.org/prd/abstract/10.1103/PhysRevD.100.034515Boltzmann generators – Sampling equilibrium states of many-body systems with deep learning: https://www.science.org/doi/10.1126/science.aaw1147Solving Statistical Mechanics Using Variational Autoregressive Networks: https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.122.080602An example of discrete version of similar algorithms: https://journals.aps.org/prresearch/abstract/10.1103/PhysRevResearch.3.L042024Grothendieck's conference: https://www.youtube.com/watch?v=ZW9JpZXwGXcTranscript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 24 Jan 2024 - 1h 05min - 104 - Why Even Care About Science & Rationality
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/97-probably-overthinking-statistical-paradoxes-allen-downey/
Watch the interview: https://www.youtube.com/watch?v=KgesIe3hTe0
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Sat, 20 Jan 2024 - 09min - 103 - How To Get Into Causal Inference
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/97-probably-overthinking-statistical-paradoxes-allen-downey/
Watch the interview: https://www.youtube.com/watch?v=KgesIe3hTe0
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Wed, 17 Jan 2024 - 10min - 102 - #97 Probably Overthinking Statistical Paradoxes, with Allen Downey
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
In this episode, I had the pleasure of speaking with Allen Downey, a professor emeritus at Olin College and a curriculum designer at Brilliant.org. Allen is a renowned author in the fields of programming and data science, with books such as "Think Python" and "Think Bayes" to his credit. He also authors the blog "Probably Overthinking It" and has a new book by the same name, which he just released in December 2023.
In this conversation, we tried to help you differentiate between right and wrong ways of looking at statistical data, discussed the Overton paradox and the role of Bayesian thinking in it, and detailed a mysterious Bayesian killer app!
But that’s not all: we even addressed the claim that Bayesian and frequentist methods often yield the same results — and why it’s a false claim. If that doesn’t get you to listen, I don’t know what will!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
LBS #41, Thinking Bayes, with Allen Downey: https://learnbayesstats.com/episode/41-think-bayes-allen-downey/Allen’s blog: https://www.allendowney.com/blog/Allen on Twitter: https://twitter.com/allendowneyAllen on GitHub: https://github.com/AllenDowneyOrder Allen’s book, Probably Overthinking It, at a 30% discount with the code UCPNEW: https://press.uchicago.edu/ucp/books/book/chicago/P/bo206532752.htmlThe Bayesian Killer App: https://www.allendowney.com/blog/2023/03/20/the-bayesian-killer-app/Bayesian and Frequentist Results Are Not the Same, Ever: https://www.allendowney.com/blog/2021/04/25/bayesian-and-frequentist-results-are-not-the-same-ever/Allen’s presentation on the Overton paradox: https://docs.google.com/presentation/d/1-Uvby1Lfe1BTsxNv5R6PhXfwkLUgsyJgdkKtO8nUfJo/edit#slide=id.g291c5d4559e_0_0Video on the Overton Paradox, from PyData NYC 2022: https://youtu.be/VpuWECpTxmMThompson sampling as a dice game: https://allendowney.github.io/TheShakes/Causal quartets – Different ways to attain the same average treatment effect: http://www.stat.columbia.edu/~gelman/research/unpublished/causal_quartets.pdfLBS #89, Unlocking the Science of Exercise, Nutrition & Weight Management, with Eric Trexler: https://learnbayesstats.com/episode/89-unlocking-science-exercise-nutrition-weight-management-eric-trexler/How Minds Change, David McRaney: https://www.davidmcraney.com/howmindschangehomeAbstract
We are happy to welcome Allen Downey back to our show, and he has great news for us: his new book “Probably Overthinking It” is available now.
You might know Allen from his blog by the same name or his previous work. Or maybe you watched some of his educational videos which he produces in his new position at brilliant.org.
We delve right into exciting topics like collider bias and how it can explain the “low birth weight paradox” and other situations that only seem paradoxical at first, until you apply causal thinking to them.
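If you want to see collider bias with your own eyes, here is a small simulation with made-up numbers (mine, not Allen's): smoking and birth defects both cause low birth weight, defects are far more dangerous, and conditioning on the collider makes smoking look protective.

```python
import numpy as np

# Illustrative simulation of the low-birth-weight "paradox" as collider bias.
# Smoking and birth defects both cause low birth weight; defects are far more
# dangerous. Conditioning on the collider (low birth weight) makes smoking
# look protective, even though it is harmful overall.

rng = np.random.default_rng(42)
n = 200_000

smoking = rng.random(n) < 0.3
defect = rng.random(n) < 0.02
low_bw = rng.random(n) < (0.05 + 0.15 * smoking + 0.60 * defect)
mortality = rng.random(n) < (0.01 + 0.01 * smoking + 0.30 * defect)

def rate(mask):
    return mortality[mask].mean()

print("overall:      smokers", round(rate(smoking), 4),
      " non-smokers", round(rate(~smoking), 4))
print("low bw only:  smokers", round(rate(smoking & low_bw), 4),
      " non-smokers", round(rate(~smoking & low_bw), 4))
```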
Another classic Allen can unmystify for us is Simpson’s paradox. The problem is not the data, but your expectations of the data. We talk about some cases of Simpson’s paradox, for example from statistics on the Covid-19 pandemic, also featured in his book.
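Here is a tiny numerical illustration of Simpson's paradox with hypothetical numbers (not the Covid data from the book): a treatment that wins in every subgroup but loses after pooling, because it was mostly given to the harder cases.

```python
# Hypothetical counts showing Simpson's paradox: treatment A beats B in every
# subgroup, yet looks worse when the subgroups are pooled, because A was
# mostly given to the harder (older) cases.

data = {
    "young": {"A": (45, 50), "B": (425, 500)},   # (cured, treated)
    "old":   {"A": (300, 500), "B": (25, 50)},
}

for group, arms in data.items():
    for arm, (cured, treated) in arms.items():
        print(f"{group:>5} {arm}: {cured / treated:.0%} cured")

for arm in ("A", "B"):
    cured = sum(data[g][arm][0] for g in data)
    treated = sum(data[g][arm][1] for g in data)
    print(f"pooled {arm}: {cured / treated:.0%} cured")
```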
We also cover the “Overton paradox” - which Allen named himself - about how people report their ideologies as liberal or conservative over time.
Beyond causal thinking and statistical paradoxes, we return to the common claim that frequentist and Bayesian statistics often give the same results. Allen explains that they are fundamentally different and that Bayesians should not shy away from pointing this out and emphasizing the strengths of their methods.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 09 Jan 2024 - 1h 12min - 101 - How to Choose & Use Priors, with Daniel Lee
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/96-pharma-models-sports-analytics-stan-news-daniel-lee/
Watch the interview: https://www.youtube.com/watch?v=lnq5ZPlup0E
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas and Luke Gorrie.
Wed, 20 Dec 2023 - 09min - 100 - Becoming a Good Bayesian & Choosing Mentors, with Daniel Lee
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Listen to the full episode: https://learnbayesstats.com/episode/96-pharma-models-sports-analytics-stan-news-daniel-lee/
Watch the interview: https://www.youtube.com/watch?v=lnq5ZPlup0E
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas and Luke Gorrie.
Wed, 13 Dec 2023 - 09min - 99 - #96 Pharma Models, Sports Analytics & Stan News, with Daniel Lee
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Getting Daniel Lee on the show is a real treat — with 20 years of experience in numeric computation; 10 years creating and working with Stan; 5 years working on pharma-related models, you can ask him virtually anything. And that I did…
From joint models for estimating oncology treatment efficacy to PK/PD models; from data fusion for U.S. Navy applications to baseball and football analytics, as well as common misconceptions or challenges in the Bayesian world — our conversation spans a wide range of topics that I’m sure you’ll appreciate!
Daniel studied Mathematics at MIT and Statistics at Cambridge University, and, when he’s not in front of his computer, is a savvy basketball player and… a hip hop DJ — you actually have his SoundCloud profile in the show notes if you’re curious!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas and Luke Gorrie.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Daniel on Linkedin: https://www.linkedin.com/in/syclik/Daniel on Twitter: https://twitter.com/djsyclikDaniel on GitHub: https://github.com/syclikDaniel's DJ profile: https://soundcloud.com/dj-syclikLBS #91, Exploring European Football Analytics, with Max Göbel: https://learnbayesstats.com/episode/91-exploring-european-football-analytics-max-gobel/LBS #85, A Brief History of Sports Analytics, with Jim Albert: https://learnbayesstats.com/episode/85-brief-history-sports-analytics-jim-albert/Daniel about GPTs in Probabilistic Programming: https://www.youtube.com/watch?v=KUuSwLMFPHMLBS #50, Ta(l)king Risks & Embracing Uncertainty, with David Spiegelhalter: https://learnbayesstats.com/episode/50-talking-risks-embracing-uncertainty-david-spiegelhalter/LBS #76, The Past, Present & Future of Stan, with Bob Carpenter: https://learnbayesstats.com/episode/76-past-present-future-of-stan-bob-carpenter/LBS #27, Modeling the US Presidential Elections, with Andrew Gelman & Merlin Heidemanns: https://learnbayesstats.com/episode/27-modeling-the-us-presidential-elections-with-andrew-gelman-merlin-heidemanns/LBS #20, Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari: https://learnbayesstats.com/episode/20-regression-and-other-stories-with-andrew-gelman-jennifer-hill-aki-vehtari/Abstract
Our guest this week, Daniel Lee, is a real Bayesian all-rounder and will give us new insights into a lot of Bayesian applications.
Daniel got introduced to Bayesian stats when trying to estimate the failure rate of satellite dishes as an undergraduate student. He was lucky to be mentored by Bayesian greats like David Spiegelhalter, Andrew Gelman and Bob Carpenter. He also sat in on reading groups at universities, where he learned about cutting-edge developments, something he recommends to anyone who wants to dive deep into the field.
He used all this experience working on PK/PD (pharmacokinetics/pharmacodynamics) models. We talk about the challenges in understanding individual responses to drugs based on the speed with which they move through the body. Bayesian statistics allows for incorporating more complexity into those models for more accurate estimation.
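For readers who have never seen a PK model, here is a sketch of the standard one-compartment model with first-order absorption, with each subject's parameters drawn from a population distribution. The parameter values are invented for illustration, not taken from Daniel's work; the between-subject variability they produce is exactly what hierarchical Bayesian PK/PD models are built to capture.

```python
import numpy as np

# One-compartment PK model with first-order absorption (standard textbook form):
# concentration over time depends on the absorption rate ka, the elimination
# rate ke, and the volume of distribution V. Each simulated subject gets their
# own parameters drawn around population typical values.

def concentration(t, dose, ka, ke, V, F=1.0):
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

rng = np.random.default_rng(0)
t = np.linspace(0.5, 24, 48)          # hours after an oral dose
dose = 100.0                          # mg, hypothetical dose

for subject in range(3):
    # Log-normal between-subject variability around population typical values
    ka = 1.0 * np.exp(rng.normal(0, 0.3))
    ke = 0.2 * np.exp(rng.normal(0, 0.3))
    V = 30.0 * np.exp(rng.normal(0, 0.2))
    c = concentration(t, dose, ka, ke, V)
    print(f"subject {subject}: peak {c.max():.2f} mg/L at t = {t[c.argmax()]:.1f} h")
```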
Daniel also worked on decision-making and information-fusion problems for the military, such as identifying a plane as friend or foe from the radar data of several ships.
And to add even more diversity to his repertoire, Daniel now also works in the world of sports analytics, another popular topic on our show. We talk about the state of this emerging field and its challenges.
Finally, we cover some Stan news, and discuss common problems and misconceptions around Bayesian statistics and how to resolve them.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 28 Nov 2023 - 55min - 98 - #95 Unraveling Cosmic Mysteries, with Valerie Domcke
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
Welcome to another installment of our LBS physics deep dive! After exploring the world of experimental physics at CERN in our first video documentary in episode 93, we’ll stay in Geneva for this one, but this time we’ll dive into theoretical physics.
We’ll explore mysterious components of the universe, like dark matter and dark energy. We’ll also see how the study of gravity intersects with the study of particle physics, especially when considering black holes and the early universe. Even crazier, we’ll see that there are actual experiments and observational projects going on to answer these fundamental questions!
Our guide for this episode is Valerie Domcke, a permanent research staff member at CERN, who did her PhD in Hamburg, Germany, and postdocs in Trieste and Paris.
When she’s not trying to decipher the mysteries of the universe, Valerie can be found on boats, as she’s a big sailing fan.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates,Matt Niccolls and Maksim Kuznecov.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Valerie’s webpage: https://theory.cern/roster/domcke-valerieValerie on Google Scholar: https://scholar.google.com/citations?user=E3g0tn4AAAAJLBS #93 A CERN Odyssey, with Kevin Greiff: https://www.youtube.com/watch?v=rOaqIIEtdpILBS #64, Modeling the Climate & Gravity Waves, with Laura Mansfield: https://learnbayesstats.com/episode/64-modeling-climate-gravity-waves-laura-mansfield/LBS Physics Playlist: https://learnbayesstats.com/physics-astrophysics/Abstract
Episode 95 is another instalment of our Deep Dive into Physics series. And this time we move away from the empirical side of this topic towards more theoretical questions.
There is no one better for this topic than Dr. Valerie Domcke. Valerie is the second researcher from CERN we have on our show; she works in the Department of Theoretical Physics there.
We mainly focus on the Standard Model of particle physics: where it fails to explain observations, what proposals exist to update or replace it, and what kind of evidence would be needed to decide between them.
Valerie is particularly interested in situations in which the Standard Model breaks down, such as the observed excess gravitational pull that cannot be accounted for by visible stars.
Of course, we cover fascinating topics like dark matter, dark energy, black holes and gravitational waves that are places to look for evidence against the Standard Model.
Looking more at the practical side of things, we discuss the challenges of disentangling signal from noise, especially in such complex fields as astrophysics and quantum physics.
We also touch upon the challenges Valerie is currently tackling in her work on a new gravitational-wave observatory, the Laser Interferometer Space Antenna (LISA).
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 15 Nov 2023 - 1h 00min - 97 - #94 Psychometrics Models & Choosing Priors, with Jonathan Templin
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
In this episode, Jonathan Templin, Professor of Psychological and Quantitative Foundations at the University of Iowa, shares insights into his journey in the world of psychometrics.
Jonathan’s research focuses on diagnostic classification models — psychometric models that seek to provide multiple reliable scores from educational and psychological assessments. He also studies Bayesian statistics, as applied in psychometrics, broadly. So, naturally, we discuss the significance of psychometrics in psychological sciences, and how Bayesian methods are helpful in this field.
We also talk about challenges in choosing appropriate prior distributions, best practices for model comparison, and how you can use the Multivariate Normal distribution to infer the correlations between the predictors of your linear regressions.
This is a deep-reaching conversation that concludes with the future of Bayesian statistics in psychological, educational, and social sciences — hope you’ll enjoy it!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca and Dante Gates.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Jonathan’s website: https://jonathantemplin.com/Jonathan on Twitter: https://twitter.com/DrJTemplinJonathan on Linkedin: https://www.linkedin.com/in/jonathan-templin-0239b07/Jonathan on GitHub: https://github.com/jonathantemplinJonathan on Google Scholar: https://scholar.google.com/citations?user=veeVxxMAAAAJ&hl=en&authuser=1Jonathan on Youtube: https://www.youtube.com/channel/UC6WctsOhVfGW1D9NZUH1xFgJonathan’s book: https://jonathantemplin.com/diagnostic-measurement-theory-methods-applications/Jonathan’s teaching: https://jonathantemplin.com/teaching/Vehtari et al. (2016), Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC: https://arxiv.org/abs/1507.04544arviz.plot_compare: https://python.arviz.org/en/stable/api/generated/arviz.plot_compare.htmlLBS #35, The Past, Present & Future of BRMS, with Paul Bürkner: https://learnbayesstats.com/episode/35-past-present-future-brms-paul-burkner/LBS #40, Bayesian Stats for the Speech & Language Sciences, with Allison Hilger and Timo Roettger: https://learnbayesstats.com/episode/40-bayesian-stats-speech-language-sciences-allison-hilger-timo-roettger/Bayesian Model-Building Interface in Python: https://bambinos.github.io/bambi/Abstract
You have probably unknowingly already been exposed to this episode’s topic - psychometric testing - when taking a test at school or university. Our guest, Professor Jonathan Templin, tries to increase the meaningfulness of these tests by improving the underlying psychometric models - the Bayesian way, of course!
Jonathan explains that it is not easy to judge a student's ability based on exams, since exams contain measurement error and are only a snapshot. Bayesian statistics helps by naturally propagating this uncertainty through to the results.
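A minimal sketch of that uncertainty propagation, assuming a toy normal-normal measurement model rather than Jonathan's actual psychometric models: the latent ability gets a posterior whose width reflects the tests' measurement error.

```python
import numpy as np

# Toy normal-normal measurement model: a student's latent ability has a
# standard normal prior, and each observed test score is the ability plus
# Gaussian measurement error. The conjugate posterior shows how Bayesian
# inference carries the tests' error through to the reported uncertainty.

scores = np.array([0.8, 1.1, 0.6])   # standardized test scores for one student
error_sd = 0.7                       # assumed measurement error of each test

prior_mean, prior_var = 0.0, 1.0
post_var = 1.0 / (1.0 / prior_var + len(scores) / error_sd**2)
post_mean = post_var * (prior_mean / prior_var + scores.sum() / error_sd**2)

print(f"posterior ability: {post_mean:.2f} +/- {np.sqrt(post_var):.2f}")
```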
In the field of psychometric testing, marginal maximum likelihood is commonly used. This approach quickly becomes infeasible, though, when trying to marginalize over multidimensional test scores. Luckily, Bayesian posterior sampling does not suffer from this limitation.
A further reason to prefer Bayesian statistics is that it provides a lot of information in the posterior. Imagine taking a test that tells you what profession you should pursue at the end of high school. The field with the best fit is of course interesting, but the second best fit may be as well. The posterior distribution can provide this kind of information.
After becoming convinced that Bayes is the right choice for psychometrics, we also talk about practical challenges like choosing a prior for the covariance in a multivariate normal distribution, model selection procedures and more.
In the end we learn about a great Bayesian holiday destination, so make sure to listen till the end!
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Tue, 24 Oct 2023 - 1h 06min - 96 - #93 A CERN Odyssey, with Kevin Greif
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
This is a very special episode. It is the first-ever LBS video episode, and it takes place in the heart of particle physics research -- CERN 🍾
I went onsite in Geneva, to visit Kevin Greif, a doctoral candidate in particle physics at UC Irvine, and we walked around the CERN campus, talking about particle physics, dark matter, dark energy, machine learning -- and a lot more!
I still released the audio version of this episode, but I really conceived of it as a video-first episode, so I strongly recommend watching this one, as you’ll get a cool tour of the CERN campus and some of its experiments ;) I put the YouTube link in the show notes.
I hope you'll enjoy this deep dive into all things physics. If you have any recommendations for other cool scientific places I should do a documentary about, please get in touch on Twitter @LearnBayesStats, or by email.
This was literally a one-person endeavor — you may have noticed that I edited the video myself. So, if you liked it, please send this episode to your friends and colleagues -- and tell them to support the show on Patreon 😉
With enough support, that means I'll be able to continue with such in-depth content, and maybe, maybe, even pay for a professional video editor next time 🙈
Enjoy, my dear Bayesians, and best Bayesian wishes 🖖
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau and Luis Fonseca.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Video of the episode: https://www.youtube.com/watch?v=rOaqIIEtdpIKevin on Linkedin:https://www.linkedin.com/in/kevin-greif-824b091b4/ Kevin on Twitter: https://twitter.com/greif_kevinATLAS homepage: https://atlas.cernKevin’s most recent paper on unfolding: https://arxiv.org/abs/2305.10399LBS on Twitter:https://twitter.com/LearnBayesStatsLBS on Linkedin: https://www.linkedin.com/company/91594158/admin/feed/posts/LBS Physics Playlist:https://learnbayesstats.com/physics-astrophysics/ LBS #72, Why the Universe is so Deliciously Crazy, with Daniel Whiteson:https://learnbayesstats.com/episode/72-why-the-universe-is-so-deliciously-crazy-daniel-whiteson/ CERN Website:https://www.home.cern/Some physics (and physics adjacent) books Kevin enjoys:
“Copenhagen” by Michael Frayn: https://www.amazon.com/Copenhagen-Michael-Frayn/dp/0385720793“The Three Body Problem” by Cixin Liu https://www.goodreads.com/book/show/20518872-the-three-body-problem“The Aleph” by Jorge Luis Borges https://books.google.com.ar/books/about/The_Aleph_and_Other_Stories.html?id=XYZlAAAAMAAJ&redir_esc=yThe NOVA special that started it all for Kevin https://www.youtube.com/watch?v=V64toYdH9hU
Wed, 18 Oct 2023 - 1h 49min - 95 - #92 How to Make Decision Under Uncertainty, with Gerd Gigerenzer
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses. 1:1 Mentorship with me.
I love Bayesian modeling. Not only because it allows me to model interesting phenomena and learn about the world I live in. But because it’s part of a broader epistemological framework that confronts me with deep questions — how do you make decisions under uncertainty? How do you communicate risk and uncertainty? What does being rational even mean?
Thankfully, Gerd Gigerenzer is there to help us navigate these fascinating topics. Gerd is the Director of the Harding Center for Risk Literacy of the University of Potsdam, Germany.
Also Director emeritus at the Max Planck Institute for Human Development, he is a former Professor of Psychology at the University of Chicago and Distinguished Visiting Professor at the School of Law of the University of Virginia.
Gerd has written numerous awarded articles and books, including Risk Savvy, Simple Heuristics That Make Us Smart, Rationality for Mortals, and How to Stay Smart in a Smart World.
As you’ll hear, Gerd has trained U.S. federal judges, German physicians, and top managers to make better decisions under uncertainty.
But Gerd is also a banjo player, has won a medal in Judo, and loves scuba diving, skiing, and, above all, reading.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau and Luis Fonseca.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Gerd’s website: https://www.mpib-berlin.mpg.de/staff/gerd-gigerenzer
Do children have Bayesian intuitions: https://psycnet.apa.org/doiLanding?doi=10.1037%2Fxge0000979
What are natural frequencies: https://www.bmj.com/content/343/bmj.d6386
HIV screening: helping clinicians make sense of test results to patients: https://www.bmj.com/content/347/bmj.f5151
Teaching Bayesian Reasoning in Less Than Two Hours: https://www.apa.org/pubs/journals/releases/xge-1303380.pdf
How to Stay Smart in a Smart World – Why Human Intelligence Still Beats Algorithms: https://www.amazon.com/How-Stay-Smart-World-Intelligence/dp/0262046954
Gut Feelings – The Intelligence of the Unconscious: https://www.amazon.com/Gut-Feelings-Intelligence-Gerd-Gigerenzer/dp/0143113763
Better Doctors, Better Patients, Better Decisions: https://www.amazon.com/Better-Doctors-Patients-Decisions-Envisioning/dp/026251852X
LBS #50, Ta(l)king Risks & Embracing Uncertainty, with David Spiegelhalter: https://learnbayesstats.com/episode/50-talking-risks-embracing-uncertainty-david-spiegelhalter/
LBS #87, Unlocking the Power of Bayesian Causal Inference, with Ben Vincent: https://learnbayesstats.com/episode/87-unlocking-the-power-of-bayesian-causal-inference-ben-vincent/
As a bonus, Gerd playing the banjo: https://www.youtube.com/watch?v=qBllveuj8RI
Abstract
In this episode, we have none other than Gerd Gigerenzer on the show, an expert in decision-making, rationality, and communicating risk and probabilities.
Gerd is a trained psychologist and has worked at a number of distinguished institutions, such as the Max Planck Institute for Human Development in Berlin and the University of Chicago. He is director of the Harding Center for Risk Literacy in Potsdam.
One of his many topics of study is heuristics, a term often misunderstood, as he explains. We talk about the role of heuristics in a world of uncertainty, how they interact with analysis, and how they relate to intuition.
Another major topic of his work and this episode are natural frequencies and how they are a more natural way than conditional probabilities to express information such as the probability of having cancer after a positive screening.
Gerd studied the usefulness of natural frequencies in practice and contributed to them being taught in high school in Bavaria, Germany, as an important tool to navigate the real world.
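To make the natural-frequencies idea concrete, here is a small Python sketch of a screening calculation in the spirit of Gerd's examples. The numbers (1% prevalence, 90% sensitivity, 9% false-positive rate) are illustrative assumptions, not figures from the episode.

```python
# Illustrative screening example (assumed numbers, not from the episode):
# out of 1,000 people, 10 have the disease; 9 of them test positive,
# and about 89 of the 990 healthy people also test positive.
prevalence = 0.01           # P(disease)
sensitivity = 0.90          # P(positive | disease)
false_positive_rate = 0.09  # P(positive | no disease)

n = 1000
with_disease = prevalence * n                                # 10 people
true_positives = sensitivity * with_disease                  # 9 people
false_positives = false_positive_rate * (n - with_disease)   # about 89 people

# Natural-frequency reasoning: of all the people who test positive,
# how many actually have the disease?
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(f"P(disease | positive test) = {p_disease_given_positive:.2f}")  # roughly 0.09
```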
In general, Gerd is passionate about not only researching these topics but also seeing them applied outside of academia. He has taught thousands of medical doctors how to understand and communicate statistics and has also worked on a number of economic decision-making scenarios.
In the end, we discuss the benefits of simpler models for complex, uncertain situations, for example in the case of predicting flu seasons.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 04 Oct 2023 - 1h 04min - 94 - #91, Exploring European Football Analytics, with Max Göbel
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
As you may know, I’m kind of a nerd. And I also love football — I've been a PSG fan since I was 5 years old, so I’ve lived it all with this club. And yet, I’ve never done a European-centered football analytics episode because, well, the US is much more advanced when it comes to sports analytics.
But today, I’m happy to say this day has come: a sports analytics episode where we can actually talk about European football. And that is thanks to Maximilian Göbel.
Max is a post-doctoral researcher in Economics and Finance at Bocconi University in Milan. Before that, he did his PhD in Economics at the Lisbon School of Economics and Management.
Max is a very passionate football fan and played himself for almost 25 years in his local football club. Unfortunately, he had to give it up when starting his PhD — don’t worry, he still goes to the gym, or goes running and sometimes cycling.
Max is also a great cook, inspired by all kinds of Italian food, and an avid podcast listener — from financial news, to health and fitness content, and even a mysterious and entertaining Bayesian podcast…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau and Luis Fonseca.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Max’s website: https://www.maximiliangoebel.com/home
Max on GitHub: https://github.com/maxi-tb22
Max on LinkedIn: https://www.linkedin.com/in/maximilian-g%C3%B6bel-188b0413a/
Max’s Soccer Analytics page: https://www.maximiliangoebel.com/soccer-analytics
Soccer Factor Model on GitHub: https://github.com/maxi-tb22/SFM
Max webinar on his Soccer Factor Model: https://www.youtube.com/watch?v=2dGrN8JGd_w
Max's paper using Bayesian inference:
VARCTIC - A Bayesian Vector Autoregression for the Arctic: “Arctic Amplification of Anthropogenic Forcing: A Vector Autoregressive Analysis”: https://journals.ametsoc.org/view/journals/clim/34/13/JCLI-D-20-0324.1.xml
Forecasting Arctic Sea Ice:
Daily predictions of Arctic Sea Ice Extent: https://chairemacro.esg.uqam.ca/arctic-sea-ice-forecasting/?lang=en
Sea Ice Outlook (SIO) Forecasting competition: https://www.arcus.org/sipn/sea-ice-outlook
Some of Max’s coauthors:
Philippe Goulet Coulombe (UQAM): https://philippegouletcoulombe.com/
Francis X. Diebold (UPenn): https://www.sas.upenn.edu/~fdiebold/
Abstract
We already covered baseball analytics in the U.S.A. with Jim Albert in episode 85 and looked back at the decades-long history of sports analytics there. What does it look like in Europe?
To talk about this we got Max Göbel on the show. Max is a post-doctoral researcher in Economics and Finance at Bocconi University in Milan and holds a PhD in Economics from the Lisbon School of Economics and Management.
What qualifies him to talk about the sports-side of sports analytics is his passion for football and decades of playing experience.
So, can sports analytics in Europe compete with analytics in the U.S.A.? Unfortunately, not yet. Many sports clubs do not use models in their hiring decisions, leading to suboptimal choices based on players’ reputation alone, as Max explains.
He designed a factor model for the performance of single players, borrowing from his econometrics expertise (check it out on his webpage, link in the show notes).
We talk about how to grow this model from a simple, straightforward Bernoulli model for the rate of scored goals to a multilevel model incorporating other players. And of course, we discuss the benefits of using Bayesian statistics for this modelling problem.
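As a rough illustration of that progression (and emphatically not Max's actual Soccer Factor Model), here is a minimal PyMC sketch that treats each player's goals as Binomial trials and partially pools the scoring rates across players; the data and priors are made up for the example.

```python
import numpy as np
import pymc as pm

# Toy data (assumed): shots taken and goals scored for a handful of players
shots = np.array([120, 95, 210, 60, 150])
goals = np.array([14, 9, 35, 4, 22])
n_players = len(shots)

with pm.Model() as scoring_model:
    # League-wide scoring tendency (hyperpriors)
    mu = pm.Normal("mu", 0.0, 1.5)
    sigma = pm.HalfNormal("sigma", 1.0)

    # Player-level effects, partially pooled towards the league mean
    player_effect = pm.Normal("player_effect", mu, sigma, shape=n_players)
    p_score = pm.Deterministic("p_score", pm.math.sigmoid(player_effect))

    # Each shot is a Bernoulli trial; aggregated per player as a Binomial
    pm.Binomial("goals", n=shots, p=p_score, observed=goals)

    idata = pm.sample()
```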
We also cover sports analytics more generally and why it may not be so widely used in European football clubs yet.
Besides his interest in football analytics, Max has worked, and still works, on topics in econometrics such as regression forecasting in the U.S.A., asset pricing, and applying econometric methods to climate issues like climate forecasting and sea-ice disappearance.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 20 Sep 2023 - 1h 04min - 93 - #90, Demystifying MCMC & Variational Inference, with Charles Margossian
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
What’s the difference between MCMC and Variational Inference (VI)? Why is MCMC called an approximate method? When should we use VI instead of MCMC?
These are some of the captivating (and practical) questions we’ll tackle in this episode. I had the chance to interview Charles Margossian, a research fellow in computational mathematics at the Flatiron Institute, and a core developer of the Stan software.
Charles was born and raised in Paris, and then moved to the US to pursue a bachelor’s degree in physics at Yale university. After graduating, he worked for two years in biotech, and went on to do a PhD in statistics at Columbia University with someone named… Andrew Gelman — you may have heard of him.
Charles also specializes in pharmacometrics and epidemiology, so we also talked about some practical applications of Bayesian methods and algorithms in these fascinating fields.
Oh, and Charles’ life doesn’t only revolve around computers: he practices ballroom dancing and pickup soccer, and used to do improvised musical comedy!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar and Matt Rosinski.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Charles’ website: https://charlesm93.github.io/
Charles on Twitter: https://twitter.com/charlesm993
Charles on GitHub: https://github.com/charlesm93
Charles on Google Scholar: https://scholar.google.com/citations?user=nPtLsvIAAAAJ&hl=en
Stan software: https://mc-stan.org/
Torsten – Applications of Stan in Pharmacometrics: https://github.com/metrumresearchgroup/Torsten
R̂ – Assessing the convergence of Markov chain Monte Carlo when running many short chains: https://arxiv.org/abs/2110.13017
Revisiting the Gelman-Rubin Diagnostic: https://arxiv.org/abs/1812.09384
An importance sampling approach for reliable and efficient inference in Bayesian ordinary differential equation models: https://arxiv.org/abs/2205.09059
Pathfinder – Parallel quasi-Newton variational inference: https://arxiv.org/pdf/2108.03782.pdf
Bayesian workflow for disease transmission modeling in Stan: https://mc-stan.org/users/documentation/case-studies/boarding_school_case_study.html
LBS #76 – The Past, Present & Future of Stan, with Bob Carpenter: https://learnbayesstats.com/episode/76-past-present-future-of-stan-bob-carpenter/
LBS #51 – Bernoulli’s Fallacy & the Crisis of Modern Science, with Aubrey Clayton: https://learnbayesstats.com/episode/51-bernoullis-fallacy-crisis-modern-science-aubrey-clayton/
Flatiron Institute: https://www.simonsfoundation.org/flatiron/
Simons Foundation: https://www.simonsfoundation.org/
Abstract
In episode 90, we cover both methodological advances, namely variational inference and MCMC sampling, and their application in pharmacometrics.
And we have just the right guest for this topic: Charles Margossian! You might know Charles from his work on Stan, his workshop teaching, or his current position at the Flatiron Institute.
His main focus now is on two topics: variational inference and MCMC sampling. When is variational inference (or approximate Bayesian methods) appropriate? And when does it fail? Charles answers these questions convincingly, clearing up some discussion around this topic.
In his work on MCMC, he tries to answer some fundamental questions: How much computational power should we invest? When is MCMC sampling more appropriate than approximate Bayesian methods? The short answer: when you care about quantifying uncertainty. We even talk about what the R-hat measure means and how to improve on it with nested R-hats.
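For listeners who want to poke at the classic (non-nested) R-hat themselves, here is a minimal ArviZ sketch; it assumes you already have posterior draws in an InferenceData object and uses one of ArviZ's built-in example posteriors as a stand-in.

```python
import arviz as az

# Load an example posterior shipped with ArviZ (stands in for your own idata)
idata = az.load_arviz_data("centered_eight")

# Split R-hat compares within-chain and between-chain variance;
# values close to 1 suggest the chains agree on where the posterior mass is.
print(az.rhat(idata))

# Effective sample size is the usual companion diagnostic
print(az.ess(idata))
```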
After covering these two topics, we move to his practical work: pharmacometrics. For example, he worked on modelling the speed of drugs dissolving in the body or the role of genetics in the workings of drugs.
Charles also contributes to making Bayesian methods more accessible for pharmacologists: He co-developed the Torsten library for Stan that facilitates Bayesian analysis with pharmacometric data.
We discuss the nature of pharmacometric data and how it is usually modelled with Ordinary Differential Equations.
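As a point of reference for the kind of model involved, here is a minimal one-compartment oral-absorption sketch, the textbook pharmacokinetic model rather than anything from Charles' own work, with made-up parameter values.

```python
import numpy as np

def one_compartment_oral(t, dose, ka, ke, volume):
    """Analytic solution of the one-compartment ODE system
       dA_gut/dt = -ka * A_gut,  dA_body/dt = ka * A_gut - ke * A_body,
       returning drug concentration in the central compartment."""
    return (dose * ka / (volume * (ka - ke))) * (np.exp(-ke * t) - np.exp(-ka * t))

# Made-up parameters: 100 mg dose, absorption/elimination rates per hour, 50 L volume
t = np.linspace(0, 24, 100)
conc = one_compartment_oral(t, dose=100.0, ka=1.2, ke=0.15, volume=50.0)
print(f"Peak concentration of {conc.max():.2f} mg/L at about t = {t[conc.argmax()]:.1f} h")
```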
In the end we briefly cover one practical example of pharmacometric modelling: the Covid-19 pandemic.
All in all, episode 90 is another detailed one, covering many state-of-the-art techniques and their application.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 06 Sep 2023 - 1h 37min - 92 - #89 Unlocking the Science of Exercise, Nutrition & Weight Management, with Eric Trexler
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
If you’ve ever tried to lose fat or gain muscle, you may have noticed… it’s not easy. But it’s precisely its complexity that makes the science of exercise and nutrition fascinating.
This is the longest LBS episode so far, and you’ll understand why pretty quickly: we covered a very wide range of topics, starting with the concept of metabolic adaptation and how our physiology and brain react to caloric deficits or caloric surpluses.
We also talked about the connection between metabolic adaptation and exercise energy compensation, shedding light on the interactions between the two, and how they make weight management more complex.
Statistics are of utmost importance in these endeavors, so of course we touched on how Bayesian stats can help mitigate the challenges of low sample sizes and the over-focus on average treatment effects.
My guest for this marathon episode is none other than Eric Trexler. Currently at the Department of Evolutionary Anthropology of Duke University, Eric conducts research on metabolism and cardiometabolic health. He has a PhD in Human Movement Science from UNC Chapel Hill and has published dozens of peer-reviewed research papers related to exercise, nutrition, and metabolism.
In addition, Eric is a former professional bodybuilder and has been coaching clients with goals related to health, fitness, and athletics since 2009.
In other words, get comfy for a broad and nerdy conversation about the mysteries related to energy expenditure regulation, weight management, and evolutionary mechanisms underpinning current health challenges.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar and Matt Rosinski.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Eric’s webpage: www.trexlerfitness.com
Monthly Applications in Strength Sport (MASS) research review: https://massresearchreview.com/
Eric on Twitter: https://twitter.com/EricTrexler
Eric on Instagram: https://www.instagram.com/trexlerfitness/
Eric on YouTube: https://www.youtube.com/@erictrexler
Eric on Linkedin: https://www.linkedin.com/in/eric-trexler-19b8a9154/
Eric’s research: https://www.researchgate.net/profile/Eric-Trexler
The Metabolic Adaptation Manual – Problems, Solutions, and Life After Weight Loss: https://www.strongerbyscience.com/metabolic-adaptation/
MASS on Instagram: https://www.instagram.com/massresearchreview/
Burn – New Research Blows the Lid Off How We Really Burn Calories, Lose Weight, and Stay Healthy: https://www.amazon.com/Burn-Research-Really-Calories-Healthy/dp/0525541527
Causal quartets – Different ways to attain the same average treatment effect: http://www.stat.columbia.edu/~gelman/research/unpublished/causal_quartets.pdf
How to Change – The Science of Getting from Where You Are to Where You Want to Be: https://www.amazon.com/How-Change-Science-Getting-Where/dp/059308375X/ref=tmm_hrd_swatch_0?_encoding=UTF8&qid=&sr=
The Sweet Spot – The Pleasures of Suffering and the Search for Meaning: https://www.amazon.com/Sweet-Spot-Pleasures-Suffering-Meaning/dp/0062910566
The Stoic Challenge – A Philosopher's Guide to Becoming Tougher, Calmer, and More Resilient: https://www.amazon.com/Stoic-Challenge-Philosophers-Becoming-Resilient/dp/0393652491
LBS #61 Why we still use non-Bayesian methods, with EJ Wagenmakers: https://learnbayesstats.com/episode/61-why-we-still-use-non-bayesian-methods-ej-wagenmakers/
LBS #35 The Past, Present & Future of BRMS, with Paul Bürkner: https://learnbayesstats.com/episode/35-past-present-future-brms-paul-burkner/
Abstract
In episode 89, we cover a so-far underrepresented topic on this podcast: Nutrition science, sports science, their relation and of course, the role of Bayesian statistics in that field.
Eric Trexler is the one introducing us to this topic. With his PhD in Human Movement Science from UNC Chapel Hill, previous career as professional bodybuilder and extensive experience as a health and fitness coach, he is perfectly suited for the job.
We cover a lot of ground in this episode, focusing on the science of weight-loss and the challenges to losing weight after a certain point due to an adapted energy expenditure.
We look at energy expenditure and changes in metabolism from several angles, including the evolutionary background for these adaptations and how they affect us in modern times.
We also discuss how differently individuals react to calorie restriction or surplus, different approaches to motivating oneself to lose weight, and the overall complexity of this topic.
In the latter half of the episode, we focus more on scientific practices in sports science and how they can be improved.
One way forward is, of course, to use more Bayesian statistics, especially because of the oftentimes small sample sizes in Eric’s field.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Wed, 23 Aug 2023 - 1h 59min - 91 - #88 Bridging Computation & Inference in Artificial Intelligent Systems, with Philipp Hennig
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
Today, we’re gonna learn about probabilistic numerics — what they are, what they are good for, and how they relate computation and inference in artificial intelligent systems.
To do this, I have the honor of hosting Philipp Hennig, a distinguished expert in this field, and the Chair for the Methods of Machine Learning at the University of Tübingen, Germany. Philipp studied in Heidelberg, also in Germany, and at Imperial College, London. Philipp received his PhD from the University of Cambridge, UK, under the supervision of David MacKay, before moving to Tübingen in 2011.
Since his PhD, he has been interested in the connection between computation and inference. With international colleagues, he helped establish the idea of probabilistic numerics, which describes computation as Bayesian inference. His book, Probabilistic Numerics — Computation as Machine Learning, co-authored with Mike Osborne and Hans Kersting, was published by Cambridge University Press in 2022 and is also openly available online.
So get comfy to explore the principles that underpin these algorithms, how they differ from traditional numerical methods, and how to incorporate uncertainty into the decision-making process of these algorithms.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar and Matt Rosinski.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Philipp on Twitter: https://twitter.com/PhilippHennig5
Philipp on Github: https://github.com/philipphennig
Philipp on LinkedIn: https://www.linkedin.com/in/philipp-hennig-635832278/
An introductory course on Probabilistic Numerics, taught collaboratively by Philipp’s Group: https://youtube.com/playlist?list=PL05umP7R6ij2lwDdj7IkuHoP9vHlEcH0s
An introductory tutorial on Probabilistic Numerics: https://youtu.be/0Q1ZTLHULcw
Philipp’s book: https://www.probabilistic-numerics.org/textbooks/
ProbNum python package: https://probnum.readthedocs.io/en/latest/
Probabilistic solvers for differential equations in JAX: https://pnkraemer.github.io/probdiffeq/
Probabilistic Numerical Differential Equation Solvers in Julia: https://nathanaelbosch.github.io/ProbNumDiffEq.jl/stable/#Probabilistic-Numerical-Differential-Equation-Solvers
Philipp’s research: https://www.probabilistic-numerics.org/
Philipp’s academic page: https://uni-tuebingen.de/en/fakultaeten/mathematisch-naturwissenschaftliche-fakultaet/fachbereiche/informatik/lehrstuehle/methods-of-machine-learning/start/
Tübingen Machine Learning on YouTube: https://www.youtube.com/c/TübingenML
LBS #74 Optimizing NUTS and Developing the ZeroSumNormal Distribution, with Adrian Seyboldt: https://learnbayesstats.com/episode/74-optimizing-nuts-developing-zerosumnormal-distribution-adrian-seyboldt/
LBS #12 Biostatistics and Differential Equations, with Demetri Pananos: https://learnbayesstats.com/episode/12-biostatistics-and-differential-equations-with-demetri-pananos/
Abstract
In episode 88 with Philipp Hennig, chair of Methods of Machine Learning at the Eberhard Karls University Tübingen, we learn about a new technical area for the Bayesian way of thinking: probabilistic numerics.
Philipp gives us a conceptual introduction to Machine Learning as “refining a model through data” and explains what challenges Machine Learning faces due to the intractable nature of the data and computations involved.
The Bayesian approach, emphasising uncertainty over estimates and parameters, naturally lends itself to handling these issues.
In his research group, Philipp tries to find more general implementations of classically used algorithms, while maintaining computational efficiency. They successfully achieve this goal by bringing in the Bayesian approach to inferences.
Philipp explains probabilistic numerics as “redescribing everything a computer does as Bayesian inference” and how this approach is suitable for advancing Machine Learning.
We expand on how to handle uncertainty in machine learning, and Philipp details his team's approach to handling this issue.
We also collect many resources for those interested in probabilistic numerics and finally talk about the future of this field.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Thu, 10 Aug 2023 - 1h 11min - 90 - #87 Unlocking the Power of Bayesian Causal Inference, with Ben Vincent
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
I’ll be honest — this episode is long overdue. Not only because Ben Vincent is a friend, fellow PyMC Labs developer, and outstanding Bayesian modeler. But because he works on so many fascinating topics — so I’m all the happier to finally have him on the show!
In this episode, we’re gonna focus on causal inference, how it naturally extends Bayesian modeling, and how you can use the CausalPy open-source package to supercharge your Bayesian causal inference. We’ll also touch on marketing models and the pymc-marketing package, because, well, Ben does a lot of stuff ;)
Ben got his PhD in neuroscience at Sussex University, in the UK. After a postdoc at the University of Bristol working on robots and active vision, as well as 15 years as a lecturer at the University of Dundee, in Scotland, he switched to the private sector, working with us full time at PyMC Labs — and that is a treat!
When he’s not working, Ben loves running 5k’s, cycling in the forest, lifting weights, and… learning about modern monetary theory.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Ben’s website: https://drbenvincent.github.io/
Ben on GitHub: https://github.com/drbenvincent
Ben on Twitter: https://twitter.com/inferencelab
Ben on LinkedIn: https://www.linkedin.com/in/dr-benjamin-vincent-503571127/
CausalPy – Causal inference for quasi-experiments: https://causalpy.readthedocs.io/en/latest/
PyMC Marketing – Bayesian marketing toolbox in PyMC: https://www.pymc-marketing.io/en/stable/index.html
PyMC Labs: https://www.pymc-labs.io/products/
LBS #23 – Bayesian Stats in Business and Marketing Analytics, with Elea McDonnel Feit: https://learnbayesstats.com/episode/23-bayesian-stats-in-business-and-marketing-analytics-with-elea-mcdonnel-feit/
LBS #63 – Media Mix Models & Bayes for Marketing, with Luciano Paz: https://learnbayesstats.com/episode/63-media-mix-models-bayes-marketing-luciano-paz/
Abstract
written by Christoph Bamberg
In this podcast episode, our guest, Ben Vincent, a fellow member of PyMC Labs with a PhD in Neuroscience and extensive experience in teaching and data analysis, introduces us to CausalPy and PyMC Marketing.
During his academic career, Ben got introduced to Bayesian statistics but, like most academics, did not come across causal inference.
We discuss the importance of a systematic causal approach for important questions like health care interventions or marketing investments.
Although causality is somewhat orthogonal to the choice of statistical approach, Bayesian statistics is a good basis for causal analyses, for example in the form of Directed Acyclic Graphs.
To make causal inference more accessible, Ben developed a Python package called CausalPy, which allows you to perform common causal inferences, e.g. working with natural experiments.
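To give a flavour of the kind of quasi-experimental analysis CausalPy automates, here is a hand-rolled interrupted-time-series sketch in plain PyMC. This illustrates the general idea only; it is not CausalPy's actual API, which is documented at the link above, and the data are simulated.

```python
import numpy as np
import pymc as pm

# Simulated weekly outcome with an intervention starting at week 60 (toy data)
rng = np.random.default_rng(0)
t = np.arange(100)
treated = (t >= 60).astype(float)
y = 2.0 + 0.05 * t + 1.5 * treated + rng.normal(0, 0.5, size=t.size)

with pm.Model() as its_model:
    intercept = pm.Normal("intercept", 0, 5)
    trend = pm.Normal("trend", 0, 1)
    effect = pm.Normal("effect", 0, 2)   # jump at the intervention
    sigma = pm.HalfNormal("sigma", 1)

    mu = intercept + trend * t + effect * treated
    pm.Normal("y", mu, sigma, observed=y)

    idata = pm.sample()

# The posterior for `effect` is the estimated causal impact of the intervention,
# under the (strong) assumption that the pre-trend would have continued unchanged.
```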
Ben was also involved in the development of PyMC Marketing, a package that conveniently bundles important analysis capacities for Marketing. The package focuses on Media Mix Modelling and customer lifetime analysis.
We also talked about his extensive experience teaching statistics at university and his current teaching of Bayesian methods in industry. His advice to students is to really engage with their learning material, coding through examples, which makes learning more pleasurable and practical.
Transcript
Please note that this is an automated transcript that may contain errors. Feel free to reach out if you're willing to correct them.
Sun, 30 Jul 2023 - 1h 08min - 89 - #86 Exploring Research Synchronous Languages & Hybrid Systems, with Guillaume Baudart
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
This episode is unlike anything I’ve covered so far on the show. Let me ask you: Do you know what a research synchronous language is? What about hybrid systems? Last try: have you heard of Zelus, or ProbZelus?
If you answered “no” to one of the above, then you’re just like me! And that’s why I invited Guillaume Baudart for this episode — to teach us about all these fascinating topics!
A researcher in the PARKAS team at Inria, Guillaume focuses on probabilistic and reactive programming languages. In particular, he works on ProbZelus, a probabilistic extension of Zelus, itself a research synchronous language for implementing hybrid systems.
To simplify, Zelus is a modeling framework for simulating systems with both smooth and discrete dynamics — if you’ve ever worked with ODEs, you may be familiar with these terms.
If you’re not — great, Guillaume will explain everything in the episode! And I know it might sound niche, but this kind of approach actually has very important applications — such as proving that there are no bugs in a program.
Guillaume did his PhD at École Normale Supérieure, in Paris, working on reactive programming languages and quasi-periodic systems. He then worked in the AI programming team of IBM Research, before coming back to the École Normale Supérieure, working mostly on reactive and probabilistic programming.
In his free time, Guillaume loves spending time with his family, playing the violin with friends, and… cooking!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Guillaume’s website: https://guillaume.baudart.eu/
ProbZelus on GitHub: https://github.com/IBM/probzelus
Zelus docs: https://zelus.di.ens.fr/
Short Zelus introduction: https://www.di.ens.fr/~pouzet/bib/hscc13.pdf
Guillaume’s course: https://wikimpri.dptinfo.ens-cachan.fr/doku.php?id=cours:c-2-40
LBS #74 – Optimizing NUTS and Developing the ZeroSumNormal Distribution, with Adrian Seyboldt: https://learnbayesstats.com/episode/74-optimizing-nuts-developing-zerosumnormal-distribution-adrian-seyboldt/
ProbZelus (design, semantics, delayed-sampling): https://dl.acm.org/doi/abs/10.1145/3385412.3386009
Semi-symbolic inference: https://dl.acm.org/doi/abs/10.1145/3563347
Static analysis for bounded memory inference: https://dl.acm.org/doi/abs/10.1145/3485492
Abstract
Guillaume Baudart is a researcher at Inria in the PARKAS team at the Département d'Informatique (DI) of the École normale supérieure. He joins us for episode 86 to tell us about ProbZelus, a synchronous probabilistic programming language that he develops.
We have not covered synchronous languages yet, so, Guillaume gives us some context on this kind of programming approach and how ProbZelus adds probabilistic notions to it.
He explains the advantages of the probabilistic aspects of ProbZelus and how practitioners may profit from them.
For example, synchronous languages are used to program and test autopilots of planes and ensure that they do not have any bugs. ProbZelus may be useful here as Guillaume argues.
Finally, we also touch upon his teaching work and what difficulties he encounters in teaching probabilistic programming.
Transcript
Please note that this is an automated transcript that may contain errors. Feel free to reach out if you're willing to correct them.
Fri, 14 Jul 2023 - 58min - 88 - #85 A Brief History of Sports Analytics, with Jim Albert
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
In this episode, I am honored to talk with a legend of sports analytics in general, and baseball analytics in particular. I am of course talking about Jim Albert.
Jim grew up in the Philadelphia area and studied statistics at Purdue University. He then spent his entire 41-year academic career at Bowling Green State University, which gave him a wide diversity of classes to teach – from intro statistics through doctoral level.
As you’ll hear, he’s always had a passion for Bayesian education, Bayesian modeling and learning about statistics through sports. I find that passion fascinating about Jim, and I suspect that’s one of the main reasons for his prolific career — really, the list of his writings and teachings is impressive; just go take a look at the show notes.
Now an Emeritus Professor of Bowling Green, Jim is retired, but still an active tennis player and writer on sports analytics — his blog, “Exploring Baseball with R”, is nearing 400 posts!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Jim’s website: https://bayesball.github.io/
Jim’s baseball blog: https://baseballwithr.wordpress.com/
Jim on GitHub: https://github.com/bayesball
Jim on Twitter: https://twitter.com/albertbayes
Jim on Linkedin: https://www.linkedin.com/in/jim-albert-22846b41/
Jim’s baseball research: https://bayesball.github.io/BLOG/
Probability and Bayesian Modeling book: https://monika76five.github.io/ProbBayes/
Curve Ball -- Baseball, Statistics, and the Role of Chance in the Game: https://bayesball.github.io/curveball/curveball.htm
Visualizing Baseball: https://bayesball.github.io/VB/
Analyzing Baseball Data with R: https://www.amazon.com/gp/product/0815353510?pf_rd_p=c2945051-950f-485c-b4df-15aac5223b10&pf_rd_r=SFAV7QEGY9A2EDADZTJ5
Teaching Statistics Using Baseball: https://bayesball.github.io/TSUB2/
Ordinal Data Modeling: https://link.springer.com/book/10.1007/b98832?changeHeader
Workshop Statistics (an intro stats course taught from a Bayesian point of view): https://bayesball.github.io/nsf_web/main.htm
LBS #76, The Past, Present & Future of Stan, with Bob Carpenter: https://learnbayesstats.com/episode/76-past-present-future-of-stan-bob-carpenter/
MCMC Interactive Gallery: https://chi-feng.github.io/mcmc-demo/app.html?algorithm=HamiltonianMC&target=banana
Abstract
written by Christoph Bamberg
In this episode, Jim Albert, a legend of sports analytics and Emeritus Professor at Bowling Green State University, is our guest.
We talk about a range of topics, including his early interest in math and sports, challenges in analysing sports data and his experience teaching statistics.
We trace the history of baseball analytics back to the 1960s and discuss how new, advanced ways of collecting data change the possibilities of what can be modelled.
There are also statistical approaches to American football, soccer and basketball games. Jim explains why these team sports are more difficult to model than baseball.
The conversation then turns to Jim’s substantial experience teaching statistics and the challenges he sees in that. Jim has worked on several books on sports analytics and has written many blog posts on this topic.
We also touch upon the challenges of prior elicitation, a topic that has come up frequently in recent podcasts, how different stakeholders such as coaches and managers think differently about the sport and how to extract priors from their information.
For more, tune in to episode 85 with Jim Albert.
Chapters
[00:00:00] Episode Begins
[00:04:04] How did you get into the world of statistics?
[00:11:17] Baseball is more advanced on the analytics path compared to other sports
[00:17:02] How is the data collected?
[00:24:43] Why is sports analytics important and is it turning humans into robots?
[00:32:51] Loss in translation problem between modellers and domain experts...?
[00:41:43] Active learning and learning through workshops
[00:51:08] Principles before methods
[00:52:30] Your favorite sports analytics model
[01:02:07] If you had unlimited time and resources which problem would you try to solve?
Transcript
Please note that this transcript is generated automatically and may contain errors. Feel free to reach out if you are willing to correct them.
Tue, 27 Jun 2023 - 1h 06min - 87 - #84 Causality in Neuroscience & Psychology, with Konrad Kording
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
This is another installment in our neuroscience modeling series! This time, I talked with Konrad Kording about the role of Bayesian stats in neuroscience and psychology, electrophysiological data to study what neurons do, and how this helps explain human behavior.
Konrad studied at ETH Zurich, then went to University College London and MIT for his postdocs. After a decade at Northwestern University, he is now Penn Integrated Knowledge Professor at the University of Pennsylvania.
As you’ll hear, Konrad is particularly interested in the question of how the brain solves the credit assignment problem and similarly how we should assign credit in the real world (through causality). Building on this, he is also interested in applications of causality in biomedical research.
And… he’s also a big hiker, skier and salsa dancer!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Konrad’s lab: https://kordinglab.com/
Konrad’s lab on GitHub: https://github.com/KordingLab
Konrad’s lab on Twitter: https://twitter.com/KordingLab
LBS #81, Neuroscience of Perception: Exploring the Brain, with Alan Stocker: https://learnbayesstats.com/episode/81-neuroscience-of-perception-exploring-the-brain-alan-stocker/
LBS #77, How a Simple Dress Helped Uncover Hidden Prejudices, with Pascal Wallisch: https://learnbayesstats.com/episode/77-how-a-simple-dress-helped-uncover-hidden-prejudices-pascal-wallisch/
The Sports Gene, Inside the Science of Extraordinary Athletic Performance: https://davidepstein.com/david-epstein-the-sports-gene/
Decoding with good ML: https://github.com/KordingLab/Neural_Decoding and https://www.eneuro.org/content/7/4/ENEURO.0506-19.2020
Bayesian decoding: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5578432/
Textbook on Bayesian modeling of behavior: bayesianmodeling.com
Bayesian philosophy: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3981868/
Konrad talking about Neuromatch Bayes day: https://www.youtube.com/watch?v=neDaPap_5Tg
The Neuromatch Bayes tutorials: compneuro.neuromatch.io
Transcript
Please note that this is an automatic transcript and may contain errors. Feel free to reach out if you would like to correct them.
Tue, 13 Jun 2023 - 1h 05min - 86 - #83 Multilevel Regression, Post-Stratification & Electoral Dynamics, with Tarmo Jüristo
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me
One of the greatest features of this podcast, and my work in general, is that I keep getting surprised. Along the way, I keep learning, and I meet fascinating people, like Tarmo Jüristo.
Tarmo is hard to describe. These days, he’s heading an NGO called Salk, in the Baltic state called Estonia. Among other things, they are studying and forecasting elections, which is how we met and ended up collaborating with PyMC Labs, our Bayesian consultancy.
But Tarmo is much more than that. Born in 1971 in what was still the Soviet Union, he graduated in finance from Tartu University. He worked in finance and investment banking until the 2009 crisis, when he quit and started a doctorate in… cultural studies. He then went on to write for theater and TV, teaching literature, anthropology and philosophy. An avid world traveler, he also teaches kendo and Brazilian jiu-jitsu.
As you’ll hear in the episode, after lots of adventures, he established Salk, and they just used a Bayesian hierarchical model with post-stratification to forecast the results of the 2023 Estonian parliamentary elections and target the campaign efforts to specific demographics.
Oh, and one last thing: Tarmo is a fan of the show — I told you he was a great guy ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh and Grant Pezzolesi.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Tarmo on GitHub: https://github.com/tarmojuristo
Tarmo on Linkedin: https://www.linkedin.com/in/tarmo-j%C3%BCristo-7018bb7/
Tarmo on Twitter: https://twitter.com/tarmojuristo
Salk website: https://salk.ee/
Hierarchical Bayesian Modeling of Survey Data with Post-stratification: https://www.youtube.com/watch?v=efID35XUQ3I
Abstract
In episode 83 of the podcast, Tarmo Jüristo is our guest. He recently received media attention for his electoral forecasting in the Estonian election and its potential positive role in helping liberal parties gain more votes than expected.
Tarmo explains to us how he used Bayesian models with his NGO SALK to forecast the election and how he leveraged these models to unify the different liberal parties that participated in the election. So, we get a firsthand view of how to use Bayesian modelling smartly.
Furthermore, we talk about when to use Bayesian models, difficulties in modelling survey data and how post-stratification can help.
He also explains how he, with the help of PyMC Labs, added Gaussian Processes to his models to better model the time-series structure of their survey data.
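For readers who have not seen post-stratification in code, here is a deliberately tiny sketch of the core idea (partial pooling across demographic cells, then reweighting by census counts). It leaves out the Gaussian-process time component and uses made-up numbers, so it is only a caricature of Salk's real model.

```python
import numpy as np
import pymc as pm

# Toy survey: respondents per demographic cell, and how many support party A
cell_n = np.array([200, 50, 120, 30])                       # survey respondents per cell
cell_yes = np.array([90, 30, 40, 6])                        # supporters per cell
census_n = np.array([400_000, 250_000, 300_000, 50_000])    # population per cell

with pm.Model() as mrp_lite:
    mu = pm.Normal("mu", 0, 1.5)
    sigma = pm.HalfNormal("sigma", 1)
    # Cell-level effects, partially pooled towards the overall mean
    cell_effect = pm.Normal("cell_effect", mu, sigma, shape=len(cell_n))
    p = pm.Deterministic("p", pm.math.sigmoid(cell_effect))
    pm.Binomial("obs", n=cell_n, p=p, observed=cell_yes)
    idata = pm.sample()

# Post-stratification: weight each cell's posterior support by its census share
weights = census_n / census_n.sum()
p_draws = idata.posterior["p"].values            # shape (chains, draws, cells)
national_support = (p_draws * weights).sum(axis=-1)
print(national_support.mean(), np.percentile(national_support, [3, 97]))
```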
We close this episode by discussing the responsibility that comes with modelling data in politics.
Transcript
Please note that the transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
Thu, 25 May 2023 - 1h 17min - 85 - #82 Sequential Monte Carlo & Bayesian Computation Algorithms, with Nicolas Chopin
But other methods exist to infer the posterior distributions of your models — like Sequential Monte Carlo (SMC), INLA, Variational Bayes. Let's dive into those in this episode!
Fri, 05 May 2023 - 1h 06min - 84 - #81 Neuroscience of Perception: Exploring the Brain, with Alan Stocker
Did you know that the way your brain perceives speed depends on your priors? And it’s not the same at night? And it’s not the same for everybody? This is another of these episodes I love where we dive into neuroscience, how the brain works, and how it relates to Bayesian stats.
Mon, 24 Apr 2023 - 1h 14min - 83 - #80 Bayesian Additive Regression Trees (BARTs), with Sameer Deshpande
In this episode, we’ll go to the roots of regression trees. Our tree expert will be no one else than Sameer Deshpande. Sameer is an assistant professor of Statistics at the University of Wisconsin-Madison. Prior to that, he completed a postdoc at MIT and earned his Ph.D. in Statistics from UPenn.
Tue, 11 Apr 2023 - 1h 09min - 82 - #79 Decision-Making & Cost Effectiveness Analysis for Health Economics, with Gianluca Baio
Decision-making and cost effectiveness analyses rarely get as important as in the health systems — where matters of life and death are not a metaphor. Bayesian statistical modeling is extremely helpful in this field, with its ability to quantify uncertainty, include domain knowledge, and incorporate causal reasoning.
Fri, 17 Mar 2023 - 1h 07min - 81 - #78 Exploring MCMC Sampler Algorithms, with Matt D. Hoffman
You’ll hear about the circumstances Matt would advise picking up Bayesian stats, generalized HMC, blocked samplers, why do the samplers he works on have food-based names, etc.
Wed, 01 Mar 2023 - 1h 02min - 80 - #77 How a Simple Dress Helped Uncover Hidden Prejudices, with Pascal Wallisch
You remember that dress that looked black and blue to some people, and white and gold to others? Well that’s exactly what we’ll dive into and explain in this episode. Why do we literally see the world differently? Why does that even happen beyond our consciousness, most of the time? And cherry on the cake: how on Earth could this be related to… priors?? Yes, as in Bayesian priors!
Mon, 13 Feb 2023 - 1h 09min - 79 - #76 The Past, Present & Future of Stan, with Bob Carpenter
How does it feel to switch careers and start a postdoc at age 47? How was it to be one of the people who created the probabilistic programming language Stan? What should the Bayesian community focus on in the coming years?
Wed, 01 Feb 2023 - 1h 11min - 78 - #75 The Physics of Top Gun 2 Maverick, with Jason Berndt
If you’re a nerd like me, you’re always curious about the physics of any situation. So, obviously, when I watched Top Gun 2, I became fascinated by the aerodynamics of fighter jets. And it so happens that one of my friends used to be a fighter pilot for the Canadian army… Immediately, I thought this would make for a cool episode — and here we are!
Fri, 20 Jan 2023 - 1h 07min - 77 - #74 Optimizing NUTS and Developing the ZeroSumNormal Distribution, with Adrian Seyboldt
We talked about the why and how of his new project, Nutpie, a more efficient implementation of the NUTS sampler in Rust. We also dived deep into the new ZeroSumNormal distribution he created and that’s available from PyMC 4.2 onwards — what is it? Why would you use it? And when?
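If you are curious what that looks like in practice, here is a minimal, assumed example that swaps a plain Normal group effect for pm.ZeroSumNormal, which constrains the group offsets to sum to zero and removes the additive non-identifiability with the intercept; the data and priors are invented for the sketch.

```python
import numpy as np
import pymc as pm

# Toy data: 100 observations split evenly across 4 groups (assumed example)
group_idx = np.repeat(np.arange(4), 25)
true_means = np.array([0.5, -0.2, 0.1, -0.4])
y = true_means[group_idx] + np.random.default_rng(1).normal(0, 1.0, size=100)

with pm.Model() as zsn_model:
    intercept = pm.Normal("intercept", 0, 1)
    # Group offsets constrained to sum to zero, so the intercept
    # cleanly absorbs the overall mean
    group_offset = pm.ZeroSumNormal("group_offset", sigma=1.0, shape=4)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("y", intercept + group_offset[group_idx], sigma, observed=y)
    idata = pm.sample()
```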
Thu, 05 Jan 2023 - 1h 12min - 76 - #73 A Guide to Plotting Inferences & Uncertainties of Bayesian Models, with Jessica Hullman
How to best align data-driven interfaces and representations of uncertainty with human reasoning capabilities
Fri, 23 Dec 2022 - 1h 00min - 75 - #72 Why the Universe is so Deliciously Crazy, with Daniel Whiteson
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
What happens inside a black hole? Can we travel back in time? Why is the Universe even here? This is the type of chill questions that we’re all asking ourselves from time to time — you know, when we’re sitting on the beach.
This is also the kind of questions Daniel Whiteson loves to talk about in his podcast, “Daniel and Jorge Explain the Universe”, co-hosted with Jorge Cham, the author of PhD comics. Honestly, it’s one of my favorite shows ever, so I warmly recommend it. Actually, if you’ve ever hung out with me in person, there is a high chance I started nerding out about it…
Daniel is, of course, a professor of physics at the University of California, Irvine, and also a researcher at CERN, using the Large Hadron Collider to search for exotic new particles — yes, these are particles that put little umbrellas in their drinks and taste like coconut.
In his free time, Daniel loves reading, sailing and baking — I can confirm that he makes a killer Nutella roll!
Oh, I almost forgot: Daniel and Jorge wrote two books — We Have No Idea and FAQ about the Universe — which, again, I strongly recommend. They are among my all-time favorites.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek and Paul Cox.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
PyMC Labs Meetup, Dec 8th 2022, A Candle in the Dark – How to Use Hierarchical Post-Stratification with Noisy Data: https://www.meetup.com/pymc-labs-online-meetup/events/289949398/
Daniel’s website: https://sites.uci.edu/daniel/
Daniel on Twitter: https://twitter.com/DanielWhiteson
“Daniel and Jorge Explain the Universe”: https://sites.uci.edu/danielandjorge/?pname=danielandjorge.com&sc=dnsredirect
We Have No Idea – A Guide To The Unknown Universe: https://phdcomics.com/noidea/
Frequently Asked Questions About The Universe: https://sites.uci.edu/universefaq/
Learning to Identify Semi-Visible Jets: https://arxiv.org/abs/2208.10062
Twitter thread about the paper above: https://twitter.com/DanielWhiteson/status/1561929005653057536
Abstract
Big questions are tackled in episode 72 of the Learning Bayesian Statistics Podcast: “What is the nature of the universe?”, “What is the role of science?”, “How are findings in physics created and communicated?”, “What is randomness actually?”. This episode’s guest, Daniel Whiteson, is just the right person to address these questions.
He is well-known for his own podcast “Daniel and Jorge Explain the Universe”, wrote several popular science books on physics and works as a particle physicist with data from the particle physics laboratory CERN.
He manages to make sense of astrophysics, although he is not much of a star-gazer himself. Daniel prefers to look for weird stuff in the data of colliding particles and ask unexpected questions.
This comes with great statistical challenges that he tackles with Bayesian statistics and machine learning, while he also subscribes to the frequentist philosophy of statistics.
In the episode, Alex and Daniel touch upon many of the great ideas in quantum physics, the Higgs boson, Schrödinger’s cat, John Bell’s quantum entanglement discoveries, true random processes and much more. Mixed in throughout are pieces of advice for anyone scientifically-minded and curious about the universe.
Sat, 03 Dec 2022 - 1h 13min - 74 - #71 Artificial Intelligence, Deepmind & Social Change, with Julien Cornebise
This episode digs into different sides of the tech world, research, and the application of algorithms, from image recognition to AI-generated art.
Mon, 14 Nov 2022 - 1h 05min - 73 - #70 Teaching Bayes for Biology & Biological Engineering, with Justin Bois
Justin Bois, Professor of Biology at Caltech, tells us why stats is helpful for biological studies, and how best to teach it
Sat, 22 Oct 2022 - 1h 05min - 72 - #69 Why, When & How to use Bayes Factors, with Jorge Tendeiro
Wed, 05 Oct 2022 - 53min
- 71 - #68 Probabilistic Machine Learning & Generative Models, with Kevin Murphy
Kevin Murphy tells us what it's like working at Google Brain, and about his work on generative models, optimization and probabilistic machine learning
Wed, 14 Sep 2022 - 1h 05min - 70 - #67 Exoplanets, Cool Worlds & Life in the Universe, with David Kipping
Is there life in the Universe? It doesn’t get deeper than this, does it? And yet, why do we care about that? Even if there is a small chance that there is other life in the Universe, our chances of discovering it, talking to it and meeting it are smaller still. So, why do we care?
Wed, 31 Aug 2022 - 1h 00min - 69 - #66 Uncertainty Visualization & Usable Stats, with Matthew Kay
A tour of human-computer interactions and information visualization, especially in uncertainty visualization.
Wed, 17 Aug 2022 - 1h 01min - 68 - #65 PyMC, Aeppl, & Aesara: the new cool kids on the block, with Ricardo Vieira
A (random) walk through the inner workings of the newly released version of PyMC, with one of its main core developers
Wed, 03 Aug 2022 - 1h 05min - 67 - #64 Modeling the Climate & Gravity Waves, with Laura Mansfield
Did you know about gravity waves? That’s right, waves in the sky due to gravity — sounds awesome, right?
Wed, 20 Jul 2022 - 1h 07min - 66 - #63 Media Mix Models & Bayes for Marketing, with Luciano Paz
Tue, 28 Jun 2022 - 1h 14min
- 65 - #62 Bayesian Generative Modeling for Healthcare, with Maria Skoularidou
Wed, 08 Jun 2022 - 57min
- 64 - #61 Why we still use non-Bayesian methods, with EJ Wagenmakers
The big problems with classic hypothesis testing are well-known. And yet, a huge majority of statistical analyses are still conducted this way. Why is it? Why are things so hard to change? Can you even do (and should you do) hypothesis testing in the Bayesian framework?
Thu, 19 May 2022 - 1h 16min - 63 - #60 Modeling Dialogues & Languages, with J.P. de Ruiter
Using Bayesian statistics to improve our understanding of how humans and artificial agents use language, gesture and other types of signals to effectively communicate with each other.
Sat, 30 Apr 2022 - 1h 12min - 62 - #59 Bayesian Modeling in Civil Engineering, with Michael Faber
How to govern and manage risks, resilience and sustainability in the built environment... with Bayesian statistics!
Thu, 14 Apr 2022 - 59min - 61 - #58 Bayesian Modeling and Computation, with Osvaldo Martin, Ravin Kumar and Junpeng Lao
Get a hands-on approach, focusing on the practice of applied statistics. And you'll see how to use diverse libraries, like PyMC, TensorFlow Probability, ArviZ, Bambi, and so on!
Mon, 21 Mar 2022 - 1h 09min - 60 - #57 Forecasting French Elections, with… Mystery Guest
Alex made us discover new methods, new ideas, and mostly new people. But what do we really know about him? Does he even really exist? To find this out I put on my Frenchest beret, a baguette under my arm, and went undercover to try to find him.
Thu, 03 Mar 2022 - 1h 21min - 59 - #56 Causal & Probabilistic Machine Learning, with Robert Osazuwa Ness
How do you make sure your computer doesn’t just happily (and mistakenly) report correlations as causations? That’s when causal and probabilistic machine learning enter the stage, as Robert Ness will tell us...
Wed, 16 Feb 2022 - 1h 08min - 58 - #55 Neuropsychology, Illusions & Bending Reality, with Dominique Makowski
A nerdy journey through scientific methodology, history of art, religion, and philosophy — what else?
Mon, 31 Jan 2022 - 1h 13min - 57 - #54 Bayes in Theoretical Ecology, with Florian Hartig
A discussion about theoretical ecology, computer simulations and machine learning in ecology and evolution
Fri, 14 Jan 2022 - 1h 08min - 56 - #53 Bayesian Stats for the Behavioral & Neural Sciences, with Todd Hudson
Why is Bayes useful in the behavioral and neural sciences? How to model behavioral and neural data with Bayesian statistics? How to estimate measurement error and compare models?
Tue, 28 Dec 2021 - 56min - 55 - #52 Election forecasting models in Germany, with Marcus Gross
How do you design a forecasting model that's tailored to Germany's electoral system? And then how do you communicate about what it can tell you... and cannot tell you?
Thu, 09 Dec 2021 - 58min - 54 - #51 Bernoulli’s Fallacy & the Crisis of Modern Science, with Aubrey Clayton
About statistical illogic and the crisis of modern science. We talked about a catastrophic error in the logic of the standard statistical methods in almost all the sciences and why this error manifests even outside of science, like in medicine, law, public policy...
Mon, 22 Nov 2021 - 1h 09min - 53 - #50 Ta(l)king Risks & Embracing Uncertainty, with David Spiegelhalter
Sat, 06 Nov 2021 - 1h 04min
- 52 - #49 The Present & Future of Baseball Analytics, with Ehsan Bokhari
What working in the stats department of a baseball team looks like, how Bayesian baseball analytics are, what pushback Ehsan gets, and what the future of baseball analytics looks like to him
Fri, 22 Oct 2021 - 1h 12min - 51 - #48 Mixed Effects Models & Beautiful Plots, with TJ Mahr
In short, TJ wrangles data, crunches numbers, plots pictures, and fits models to study how children learn to speak and communicate. On his website, he often writes about Bayesian models, mixed effects models, functional programming in R, or how to plot certain kinds of data.
Fri, 08 Oct 2021 - 1h 01min - 50 - #47 Bayes in Physics & Astrophysics, with JJ Ruby
Tue, 21 Sep 2021 - 1h 15min
- 49 - #46 Silly & Empowering Statistics, with Chelsea Parlett-Pelleriti
How to empower people to do their own statistical analyses, and how to use statistics on behavioral data
Mon, 30 Aug 2021 - 1h 13min - 48 - #45 Biostats & Clinical Trial Design, with Frank Harrell
A deep and broad conversation about predictive models, model validation, Bayesian clinical trial design and Bayesian models, drug development, and clinical research
Tue, 10 Aug 2021 - 1h 08min - 47 - #44 Building Bayesian Models at scale, with Rémi Louf
Going from physics to philosophy to Bayes, working on open-source projects, and developing Bayesian models at scale
Thu, 22 Jul 2021 - 1h 15min - 46 - #43 Modeling Covid19, with Michael Osthege & Thomas Vladeck
Why modeling Covid is so challenging, fascinating, and... a wonderful example of Bayesian generative modeling!
Thu, 08 Jul 2021 - 1h 22min - 45 - #42 How to Teach and Learn Bayesian Stats, with Mine Dogucu
We often talk about applying Bayesian statistics on this podcast. But how do we teach them? What’s the best way to introduce them from a young age and make sure the skills students learn in the stats class are transferable?
Thu, 24 Jun 2021 - 1h 06min - 44 - #41 Thinking Bayes, with Allen Downey
What’s new in the second edition, which mistakes Allen's students most commonly make, and... a surprise!
Mon, 14 Jun 2021 - 1h 04min - 43 - #40 Bayesian Stats for the Speech & Language Sciences, with Allison Hilger and Timo Roettger
We all know about these accidental discoveries — penicillin, the heating power of microwaves, or the famous (and delicious) tarte tatin. I don’t know why, but I just love serendipity. And, as you’ll hear, this episode is deliciously full of it…
Thanks to Allison Hilger and Timo Roettger, we’ll discover the world of linguistics, how Bayesian stats are helpful there, and how Paul Bürkner’s BRMS package has been instrumental in this field. To my surprise — and perhaps yours — the speech and language sciences are pretty quantitative and computational!
Having only recently discovered Bayesian stats, Allison will also tell us about the challenges she’s faced from advisors and reviewers during her PhD at Northwestern University, and the advice she’d have for people in the same situation.
Allison is now an Assistant Professor at the University of Colorado Boulder. The overall goal in her research is to improve our understanding of motor speech control processes, in order to inform effective speech therapy treatments for improved speech naturalness and intelligibility. Allison also worked clinically as a speech-language pathologist in Chicago for a year. As a new Colorado resident, her new hobbies include hiking, skiing, and biking — and then reading or going to dog parks when she’s too tired.
Holding a PhD in linguistics from the University of Cologne, Germany, Timo is an Associate Professor for linguistics at the University of Oslo, Norway. Timo tries to understand how people communicate their intentions using speech – how are speech signals retrieved; how do people learn and generalize? Timo is also committed to improving methodologies across the language sciences in light of the replication crisis, with a strong emphasis on open science.
Most importantly, Timo loves hiking, watching movies or, even better, watching people play video games!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt and Andrew Moskowitz.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Allison's website: https://allisonhilger.com/
Allison on Twitter: https://twitter.com/drahilger
Allison's motor speech lab: https://www.colorado.edu/lab/motor-speech/
Timo's website: https://www.simplpoints.com/
Timo on Twitter: https://twitter.com/TimoRoettger
Bayesian regression modeling (for factorial designs) -- A tutorial: https://psyarxiv.com/cdxv3
An Introduction to Bayesian Multilevel Models Using brms -- A Case Study of Gender Effects on Vowel Variability in Standard Indonesian: https://biblio.ugent.be/publication/8624552/file/8624553.pdf
Longitudinal Growth in Intelligibility of Connected Speech From 2 to 8 Years in Children With Cerebral Palsy -- A Novel Bayesian Approach: https://pubs.asha.org/doi/10.1044/2020_JSLHR-20-00181
LBS #35 The Past, Present & Future of BRMS, with Paul Bürkner: https://www.learnbayesstats.com/episode/35-past-present-future-brms-paul-burkner
LBS #16 Bayesian Statistics the Fun Way, with Will Kurt: https://www.learnbayesstats.com/episode/16-bayesian-statistics-the-fun-way-with-will-kurt
Will Kurt's Bayesian Statistics The Fun Way: https://nostarch.com/learnbayes
LBS #20 Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari: https://www.learnbayesstats.com/episode/20-regression-and-other-stories-with-andrew-gelman-jennifer-hill-aki-vehtari
Regression and Other Stories examples: https://avehtari.github.io/ROS-Examples/
Fri, 28 May 2021 - 1h 05min - 42 - #39 Survival Models & Biostatistics for Cancer Research, with Jacki Buros
Episode sponsored by Tidelift: tidelift.com
It’s been a while since we talked about biostatistics and bioinformatics on this podcast, so I thought it could be interesting to talk to Jacki Buros — and that was a very good idea!
She’ll walk us through examples of Bayesian models she uses to, for instance, work on biomarker discovery for cancer immunotherapies. She’ll also introduce you to survival models — their usefulness, their powers and their challenges.
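To give a flavor of what such models look like, here is a minimal sketch of an exponential survival model with right-censoring, written in PyMC as a toy illustration (my own simplified example, not one of Jacki's actual models):

```python
import numpy as np
import pymc as pm

# Simulated data (made up): some subjects are right-censored, i.e. the study
# ended before their event was observed.
rng = np.random.default_rng(3)
n = 200
event_time = rng.exponential(scale=10.0, size=n)
censor_time = rng.uniform(5.0, 20.0, size=n)
t_obs = np.minimum(event_time, censor_time)
is_event = (event_time <= censor_time).astype(float)  # 1 = event, 0 = censored

with pm.Model() as survival_model:
    lam = pm.HalfNormal("lam", 1.0)  # constant hazard rate
    # Events contribute the exponential log-density log(lam) - lam * t;
    # censored observations only contribute the log-survival function -lam * t.
    logp_event = pm.math.log(lam) - lam * t_obs
    logp_censored = -lam * t_obs
    loglik = logp_event * is_event + logp_censored * (1.0 - is_event)
    pm.Potential("censored_loglik", loglik.sum())
    idata = pm.sample()
```

Real applications use richer hazards and covariates, but the censoring bookkeeping above is the core idea that makes survival likelihoods different from ordinary regression ones.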
Interestingly, all of this will highlight a handful of skills that Jacki would try to instill in her students if she had to teach Bayesian methods.
The Head of Data and Analytics at Generable, a state-of-the-art Bayesian platform for oncology clinical trials, Jacki has been working in biostatistics and bioinformatics for over 15 years. She started in cardiology research at the TIMI Study Group at Harvard Medical School before working in Alzheimer’s Disease genetics at Boston University and in biomarker discovery for cancer immunotherapies at the Hammer Lab. Most recently she was the Lead Biostatistician at the Institute for Next Generation Health Care at Mount Sinai.
An open-source enthusiast, Jacki is also a contributor to Stan and rstanarm, and the author of the survivalstan package, a library of Stan models for survival analysis.
Last but not least, Jacki is an avid sailor and skier!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt and Andrew Moskowitz.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Nominate "Learn Bayes Stats" as "Best Podcast of 2021" and "Best Tech Podcast" by entering its Apple feed in this form!
Jacki on Twitter: https://twitter.com/jackiburos
Jacki on GitHub: https://github.com/jburos
Jacki on Orcid: https://orcid.org/0000-0001-9588-4889
survivalstan -- Survival Models in Stan: https://github.com/hammerlab/survivalstan
rstanarm -- R model-fitting functions using Stan: http://mc-stan.org/rstanarm/
Generable -- Bayesian platform for oncology clinical trials: https://www.generable.com/
StanCon 2020 ArviZ presentation: https://github.com/arviz-devs/arviz_misc/tree/master/stancon_2020
Thinking in Bets -- Making Smarter Decisions When You Don't Have All the Facts: https://www.goodreads.com/book/show/35957157-thinking-in-bets
Scott Kelly and his space travels (in French): https://www.franceculture.fr/emissions/la-methode-scientifique/la-methode-scientifique-mardi-30-janvier-2018
Bayesian Workflow paper: https://arxiv.org/pdf/2011.01808v1.pdf
Bayesian Survival Analysis: https://www.springer.com/gp/book/9780387952772
Bayesian Survival Analysis Using the rstanarm R Package: https://arxiv.org/pdf/2002.09633.pdf
Survival Analysis, A Self-Learning Text: https://www.springer.com/gp/book/9781441966452
Survival and Event History Analysis, A Process Point of View: https://www.springer.com/gp/book/9780387202877
Prognostic Significance of Tumor-Infiltrating B Cells and Plasma Cells in Human Cancer: https://clincancerres.aacrjournals.org/content/24/24/6125
Fri, 14 May 2021 - 59min - 41 - #38 How to Become a Good Bayesian (& Rap Artist), with Baba Brinkman
Episode sponsored by Tidelift: tidelift.com
Imagine me rapping: "Let me show you how to be a good Bayesian. Change your predictions after taking information in, and if you’re thinking I’ll be less than amazing, let’s adjust those expectations!"
What?? Nah, you’re right, I’m not as good as Baba Brinkman. Actually, the best way to get « Good Bayesian » performed live on the podcast would just be to invite him for an episode… Wait, isn’t that what I did???
Well indeed! For this episode, I had the great pleasure of hosting rap artist, science communicator and revered author of « Good Bayesian », Baba Brinkman!
We talked about his passion for oral poetry, his rap career, what being a good rapper means and the difficulties he encountered in establishing himself as a proper rapper.
Baba began his rap career in 1998, freestyling and writing songs in his hometown of Vancouver, Canada.
In 2000 he started adapting Chaucer’s Canterbury Tales into original rap compositions, and in 2004 he premiered a one-man show based on his Master’s thesis, The Rap Canterbury Tales, exploring parallels between hip-hop music and medieval poetry.
Over the years, Baba went on to create “Rap Guides” dedicated to scientific topics, like evolution, consciousness, medicine, religion, and climate change – and I encourage you to give them all a listen!
By the way, do you know the common point between rap and evolutionary biology? Well, you’ll have to tune in for the answer… And make sure you listen until the end: Baba has a very, very nice surprise for you!
A little tip: if you wanna enjoy it to the fullest, I put the unedited video version of this interview in the show notes ;) By the way, let me know if you like these video live streams — I might just do them again if you do!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski and Tim Radtke.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Video live-stream of the episode: https://www.youtube.com/watch?v=YkFXpP_SvHk
Baba on Twitter: https://twitter.com/bababrinkman
Baba on YouTube: https://www.youtube.com/channel/UCz9Qm66ewnY0LAlZlL4HK9g
Baba on Spotify: https://open.spotify.com/artist/7DqKchcLvOIgR87RzJm3XH
Baba's website: https://bababrinkman.com/
Event Rap Kickstarter: https://www.kickstarter.com/projects/bababrinkman/event-rap-the-one-stop-custom-rap-shop
Event Rap website: https://www.eventrap.com/
Anil Seth -- Your Brain Hallucinates your Conscious Reality: https://www.ted.com/talks/anil_seth_your_brain_hallucinates_your_conscious_reality
The Big Picture -- On the Origins of Life, Meaning, and the Universe Itself: https://www.amazon.com/Big-Picture-Origins-Meaning-Universe/dp/1101984252
Fri, 30 Apr 2021 - 1h 27min - 40 - #37 Prophet, Time Series & Causal Inference, with Sean Taylor
Episode sponsored by Tidelift: tidelift.com
I don’t know about you, but the notion of time is really intriguing to me: it’s a purely artificial notion; we humans invented it — as an experiment, I asked my cat what time it was one day; needless to say it wasn’t very conclusive… And yet, the notion of time is so central to our lives — our work, leisure and projects depend on it.
So much so that time series predictions represent a big part of the statistics and machine learning world. And to talk about all that, who better than a time master, namely Sean Taylor?
Sean is a co-creator of the Prophet time series package, available in R and Python. He’s a social scientist and statistician specialized in methods for solving causal inference and business decision problems. Sean is particularly interested in building tools for practitioners working on real-world problems, and likes to hang out with people from many fields — computer scientists, economists, political scientists, statisticians, machine learning researchers, business school scholars — although I guess he does that remotely these days…
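For context, the standard Prophet workflow is quite compact. Here is a minimal sketch (the ds and y column names are Prophet's required input format; the data below are made up):

```python
import pandas as pd
from prophet import Prophet  # pip install prophet (formerly fbprophet)

# Prophet expects a dataframe with a date column "ds" and a value column "y".
df = pd.DataFrame({
    "ds": pd.date_range("2020-01-01", periods=365, freq="D"),
    "y": range(365),  # placeholder values; any daily series works here
})

m = Prophet()                                 # additive trend + seasonality model
m.fit(df)                                     # fitted with Stan under the hood
future = m.make_future_dataframe(periods=30)  # extend 30 days past the data
forecast = m.predict(future)                  # yhat, yhat_lower, yhat_upper, ...
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```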
Currently head of the Rideshare Labs team at Lyft, Sean was a research scientist and manager on Facebook’s Core Data Science Team and did a PhD in information systems at NYU’s Stern School of Business. He did his undergraduate degree at the University of Pennsylvania, studying economics, finance, and information systems. Last but not least, he grew up in Philadelphia, so, of course, he’s a huge Eagles fan! For my non-US listeners, we’re talking about the football team here, not the bird!
We also talked about two of my favorite topics — science communication and epistemology — so I had a lot of fun talking with Sean, and I hope you’ll deem this episode a good investment of your time.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen and Raul Maldonado.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Sean's website: https://seanjtaylor.com/
Sean on GitHub: https://github.com/seanjtaylor
Sean on Twitter: https://twitter.com/seanjtaylor
Prophet docs: https://facebook.github.io/prophet/
Forecasting at Scale -- How and why we developed Prophet for forecasting at Facebook: https://www.youtube.com/watch?v=OaTAe4W9IfA
Forecasting at Scale paper: https://www.tandfonline.com/doi/abs/10.1080/00031305.2017.1380080?journalCode=utas20&
TimeSeers -- Hierarchical version of Prophet, written in PyMC3: https://github.com/MBrouns/timeseers
The Art of Doing Science and Engineering -- Learning to Learn: https://www.amazon.com/Art-Doing-Science-Engineering-Learning/dp/1732265178
NeuralProphet -- Forecasting model based on Neural Networks in PyTorch: https://github.com/ourownstory/neural_prophet/
Introducing PyMC Labs: https://www.pymc-labs.io/blog-posts/saving-the-world/
Fri, 16 Apr 2021 - 1h 06min - 39 - #36 Bayesian Non-Parametrics & Developing Turing.jl, with Martin Trapp
Episode sponsored by Tidelift: tidelift.com
I bet you’ve already heard of Bayesian nonparametric models, at least on this very podcast. We already talked about Dirichlet Processes with Karin Knudson on episode 4, and then about Gaussian Processes with Elizaveta Semenova on episode 21. Now we’re gonna dive into the mathematical properties of these objects, to understand them better — because, as you may know, Bayesian nonparametrics are quite powerful but also very hard to fit!
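To give a sense of the objects involved, here is the classic truncated stick-breaking construction of Dirichlet process weights, as a standalone NumPy illustration (the concentration and truncation values are arbitrary choices of mine, not something from the episode):

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, K = 2.0, 25  # concentration parameter and truncation level (arbitrary)

# Break a unit-length stick: each Beta(1, alpha) draw takes a fraction of
# whatever is left, giving the mixture weights of a (truncated) Dirichlet process.
betas = rng.beta(1.0, alpha, size=K)
remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
weights = betas * remaining

print(weights[:5], weights.sum())  # the weights sum to just under 1 for finite K
```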
Along the way, you’ll learn about probabilistic circuits, sum-product networks and — what a delight — you’ll hear from the Julia community! Indeed, my guest for this episode is none other than… Martin Trapp!
Martin is a core developer of Turing.jl, an open-source framework for probabilistic programming in Julia, and a post-doc in probabilistic machine learning at Aalto University, Finland.
Martin loves working on sum-product networks and Bayesian non-parametrics. And indeed, his research interests focus on probabilistic models that exploit structural properties to allow efficient and exact computations while maintaining the capability to model complex relationships in data. In other words, Martin’s research is focused on tractable probabilistic models.
Martin did his MSc in computational intelligence at the Vienna University of Technology and just finished his PhD in machine learning at the Graz University of Technology. He doesn’t only like to study the tractability of probabilistic models — he’s also very fond of climbing!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen and Raul Maldonado.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Martin's website: https://trappmartin.github.io/
Martin on GitHub: https://github.com/trappmartin
Martin on Twitter: https://twitter.com/martin_trapp
Turing, Bayesian inference with Julia: https://turing.ml/dev/
Hierarchical Dirichlet Processes: https://people.eecs.berkeley.edu/~jordan/papers/hdp.pdf
The Automatic Statistician: https://www.doc.ic.ac.uk/~mpd37/teaching/2014/ml_tutorials/2014-01-29-slides_zoubin2.pdf
Truncated Random Measures: https://arxiv.org/abs/1603.00861
Deep Structured Mixtures of Gaussian Processes: https://arxiv.org/abs/1910.04536
Probabilistic Circuits -- Representations, Inference, Learning and Theory: https://www.youtube.com/watch?v=2RAG5-L9R70
Applied Stochastic Differential Equations, from Simo Särkkä and Arno Solin: https://users.aalto.fi/~asolin/sde-book/sde-book.pdf
Tue, 30 Mar 2021 - 1h 09min - 38 - #35 The Past, Present & Future of BRMS, with Paul Bürkner
Episode sponsored by Tidelift: tidelift.com
One of the most common guest suggestions that you dear listeners make is… inviting Paul Bürkner on the show! Why? Because he’s a member of the Stan development team and he created BRMS, a popular R package to make and sample from Bayesian regression models using Stan. And, because I like you, I did invite Paul on the show and, well, that was a good call: we had an amazing conversation, spanning so many topics that I can’t list them all here!
I asked him why he created BRMS, in which fields it’s mostly used, what its weaknesses are, and which improvements to the package he’s currently working on. But that’s not it! Paul also gave his advice to people realizing that Bayesian methods would be useful to their research, but who fear facing challenges from advisors or reviewers.
Besides being a Bayesian rockstar, Paul is a statistician working as an Independent Junior Research Group Leader at the Cluster of Excellence SimTech at the University of Stuttgart, Germany. Previously, he studied Psychology and Mathematics at the Universities of Münster and Hagen, did his PhD in Münster on optimal design and Bayesian data analysis, and worked as a postdoctoral researcher at the Department of Computer Science at Aalto University, Finland.
So, of course, I asked him about the software-assisted Bayesian workflow that he’s currently working on with Aki Vehtari, which led us to no less than the future of probabilistic programming languages…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen and Jonathan Sedar.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Paul's website: https://paul-buerkner.github.io/about/
Paul on Twitter: https://twitter.com/paulbuerkner
Paul on GitHub: https://github.com/paul-buerkner
BRMS docs: https://paul-buerkner.github.io/brms/
Stan docs: https://mc-stan.org/
Bayesian workflow paper: https://arxiv.org/pdf/2011.01808v1.pdf
Fri, 12 Mar 2021 - 1h 07min - 37 - #34 Multilevel Regression, Post-stratification & Missing Data, with Lauren Kennedy
Episode sponsored by Tidelift: tidelift.com
We already mentioned multilevel regression and post-stratification (MRP, or Mister P) on this podcast, but we didn’t dedicate a full episode to explaining how it works, why it’s useful to deal with non-representative data, and what its limits are. Well, let’s do that now, shall we?
To that end, I had the delight to talk with Lauren Kennedy! Lauren is a lecturer in Business Analytics at Monash University in Melbourne, Australia, where she develops new statistical methods to analyze social science data. Working mainly with R and Stan, Lauren studies non-representative data, multilevel modeling, post-stratification, causal inference, and, more generally, how to make inferences from the social sciences.
Needless to say, I asked her everything I could about MRP, including how to choose priors, why her recent paper about structured priors can improve MRP, and when MRP is not useful. We also talked about missing data imputation, and how all these methods relate to causal inference in the social sciences.
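To make the poststratification step concrete, here is a tiny sketch of it in NumPy. All numbers are made up; in practice the cell-level draws would come from a multilevel regression fitted to the survey:

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws, n_cells = 1000, 6

# Hypothetical posterior draws of the outcome in each demographic cell, as they
# would come out of a multilevel regression fitted to non-representative survey data.
theta_draws = rng.normal(loc=np.linspace(0.3, 0.7, n_cells), scale=0.05,
                         size=(n_draws, n_cells))

# Hypothetical census counts: how many people each cell represents in the population.
census_counts = np.array([120, 340, 80, 410, 150, 900])
weights = census_counts / census_counts.sum()

# Poststratification: reweight each posterior draw by the population shares, so the
# survey is mapped back to the population of interest and uncertainty propagates.
population_draws = theta_draws @ weights
print(population_draws.mean(), np.quantile(population_draws, [0.05, 0.95]))
```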
If you want a bit of background, Lauren did her undergraduate degrees in Psychological Sciences and in Maths and Computer Sciences at Adelaide University, with Danielle Navarro and Andrew Perfors, and then did her PhD with the same advisors. She spent 3 years in NYC with Andrew Gelman’s Lab at Columbia University, and then moved back to Melbourne in 2020. Most importantly, Lauren is an avid crocheter — she’s already on her third blanket!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege and Rémi Louf.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Lauren's website: https://jazzystats.com/
Lauren on Twitter: https://twitter.com/jazzystats
Lauren on GitHub: https://github.com/lauken13
Improving multilevel regression and poststratification with structured priors: https://arxiv.org/abs/1908.06716
Using model-based regression and poststratification to generalize findings beyond the observed sample: https://arxiv.org/abs/1906.11323
Lauren's beginners Bayes workshop: https://github.com/lauken13/Beginners_Bayes_Workshop
MRP in RStanarm: https://github.com/lauken13/rstanarm/blob/master/vignettes/mrp.Rmd
Choosing your rstanarm prior with prior predictive checks: https://github.com/stan-dev/rstanarm/blob/vignette-prior-predictive/vignettes/prior-pred.Rmd
Mister P -- What’s its secret sauce?: https://statmodeling.stat.columbia.edu/2013/10/09/mister-p-whats-its-secret-sauce/
Bayesian Multilevel Estimation with Poststratification -- State-Level Estimates from National Polls: https://pdfs.semanticscholar.org/2008/bee9f8c2d7e41ac9c5c54489f41989a0d7ba.pdf
MRPyMC3 - Multilevel Regression and Poststratification with PyMC3: https://austinrochford.com/posts/2017-07-09-mrpymc3.html
Using Hierarchical Multinomial Regression to Predict Elections in Paris districts: https://www.youtube.com/watch?v=EYdIzSYwbSw
Regression and Other Stories book: https://www.cambridge.org/fr/academic/subjects/statistics-probability/statistical-theory-and-methods/regression-and-other-stories?format=PB
Bayesian Nonparametric Modeling for Causal Inference, by Jennifer Hill: https://www.tandfonline.com/doi/abs/10.1198/jcgs.2010.08162
Lauren's Data Ethics course: https://anastasiospanagiotelis.netlify.app/teaching/dataviza2019/lectures/04dataethics/ethicaldatascience#1
Thu, 25 Feb 2021 - 1h 12min - 36 - #33 Bayesian Structural Time Series, with Ben Zweig
How do people choose their career? How do they change jobs? How do they even change careers? These are important questions that we don’t have great answers to. But structured data about the dynamics of labor markets are starting to emerge, and that’s what Ben Zweig is modeling at Revelio Labs.
An economist and data scientist, Ben is indeed the CEO of Revelio Labs, a data science company analyzing raw labor data contained in resumes, online profiles and job postings. In this episode, he’ll tell us about the Bayesian structural time series model they built to estimate inflows and outflows from companies, using LinkedIn data — a very challenging but fascinating endeavor, as you’ll hear!
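For a rough idea of the building blocks, here is a minimal local-level structural time series (a random-walk level plus observation noise) written in PyMC with fake data. This is my own illustration, not Revelio Labs' actual inflow/outflow model:

```python
import numpy as np
import pymc as pm

# Fake monthly series standing in for a company's headcount flow (illustration only).
rng = np.random.default_rng(7)
T = 60
true_level = 10 + np.cumsum(rng.normal(0.0, 0.2, size=T))
y = true_level + rng.normal(0.0, 0.5, size=T)

with pm.Model() as local_level:
    sigma_level = pm.HalfNormal("sigma_level", 1.0)  # how fast the latent level drifts
    sigma_obs = pm.HalfNormal("sigma_obs", 1.0)      # observation noise
    init = pm.Normal("init", 0.0, 10.0)              # starting point of the level
    # The latent level is a random walk: the cumulative sum of Gaussian innovations.
    innovations = pm.Normal("innovations", 0.0, sigma_level, shape=T)
    level = pm.Deterministic("level", init + pm.math.cumsum(innovations))
    pm.Normal("y", mu=level, sigma=sigma_obs, observed=y)
    idata = pm.sample()
```

Richer structural time series add trend, seasonality and regression components on top of this same latent-state idea.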
Like a lot of people, Ben had always used more traditional statistical models but had been intrigued by Bayesian methods for a long time. When they started working on this Bayesian time series model though, he had to learn a bunch of new methods really quickly. I think you’ll find it interesting to hear how it went…
Ben also teaches data science and econometrics at the NYU Stern School of Business, so he’ll reflect on his experience teaching Bayesian methods to economics students. Prior to that, Ben did a PhD in economics at the City University of New York, and has done research in occupational transformation and social mobility.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege and Rémi Louf.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Ben's bio: https://www.stern.nyu.edu/faculty/bio/benjamin-zweig
Revelio Labs blog: https://www.reveliolabs.com/blog/
Predicting the Present with Bayesian Structural Time Series: https://people.ischool.berkeley.edu/~hal/Papers/2013/pred-present-with-bsts.pdf
A Hierarchical Framework for Correcting Under-Reporting in Count Data: https://arxiv.org/pdf/1809.00544.pdf
TensorFlow Probability module for Bayesian structural time series models: https://www.tensorflow.org/probability/api_docs/python/tfp/sts/
Fitting Bayesian structural time series with the bsts R package: https://www.unofficialgoogledatascience.com/2017/07/fitting-bayesian-structural-time-series.html
CausalImpact, an R package for causal inference using Bayesian structural time-series models: https://cran.r-project.org/web/packages/CausalImpact/vignettes/CausalImpact.html
Fri, 12 Feb 2021 - 57min - 35 - #32 Getting involved into Bayesian Stats & Open-Source Development, with Peadar Coyle
When explaining Bayesian statistics to people who don’t know anything about stats, I often say that MCMC is about walking many different paths in lots of parallel universes, and then counting what happened in all these universes.
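To make that metaphor a little more concrete, here is a minimal random-walk Metropolis sampler for a standard normal target. It is my own toy illustration, not something discussed in the episode:

```python
import numpy as np

# A minimal random-walk Metropolis sampler for a 1-D standard normal target:
# take a step, keep it if it looks plausible enough, and record where you are
# at every iteration.
def log_target(x):
    return -0.5 * x**2  # unnormalized log-density of N(0, 1)

rng = np.random.default_rng(0)
n_draws, step = 10_000, 1.0
x = 0.0
draws = np.empty(n_draws)
for i in range(n_draws):
    proposal = x + rng.normal(0.0, step)
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal  # accept the step
    draws[i] = x      # count where the walk is, accepted or not

print(draws.mean(), draws.std())  # close to 0 and 1 if the walk explored the target
```

The recorded draws are the "counting" part of the metaphor: their histogram approximates the target distribution.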
And in a sense, this whole podcast is dedicated to sampling the whole distribution of Bayesian practitioners. So, for this episode, I thought we’d take a break from pure, hard modeling and talk about how to get involved in Bayesian statistics and open-source development, how companies use Bayesian tools, and what common struggles and misconceptions those companies run into.
Quite the program, right? The good news is that Peadar Coyle, my guest for this episode, has done all of that! Coming to us from Armagh, Ireland, Peadar is a fellow PyMC core developer and was a data science and data engineering consultant until recently – a period during which he covered all of modern startup data science, from AB testing to dashboards to data engineering to putting models into production.
From these experiences, Peadar has written a book consisting of numerous interviews with data scientists throughout the world – and do consider buying it, as money goes to the NumFOCUS organization, under which many Bayesian stats packages live, like Stan, ArviZ, PyMC, etc.
Now living in London, Peadar recently founded the start-up Aflorithmic, an AI company developing personalized voice-first solutions for brands and enterprises. Their technology is also used to support children, families and elderly people coping with the mental health challenges of COVID-19 confinements.
Before all that, Peadar studied physics, philosophy and mathematics at the universities of Bristol and Luxembourg. When he’s away from keyboard, he enjoys the outdoors, cooking and, of course, watching rugby!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll and Nathaniel Burbank.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
"Matchmaking Dinner" announcement: https://twitter.com/alex_andorra/status/1351136756087734272How to get acces to "Matchmaking Dinner" episodes: https://www.patreon.com/learnbayesstatsPeadar on Twitter: https://twitter.com/springcoilPeadar's website: https://peadarcoyle.com/Peadar on GitHub: https://github.com/springcoilInterviews with Data Scientists -- A discussion of the Industy and the current trends: https://leanpub.com/interviewswithdatascientists/Aflorithmic -- Personalized Audio SaaS Platform: https://www.aflorithmic.ai/Peadar's PyMC3 Primer: https://product.peadarcoyle.com/Wed, 27 Jan 2021 - 53min
Podcasts similar to Learning Bayesian Statistics
- Global News Podcast BBC World Service
- El Partidazo de COPE COPE
- Herrera en COPE COPE
- The Dan Bongino Show Cumulus Podcast Network | Dan Bongino
- Es la Mañana de Federico esRadio
- La Noche de Dieter esRadio
- Hondelatte Raconte - Christophe Hondelatte Europe 1
- Affaires sensibles France Inter
- Game Theory Explained Ishita Mehra
- La rosa de los vientos OndaCero
- Más de uno OndaCero
- La Zanzara Radio 24
- Les Grosses Têtes RTL
- L'Heure Du Crime RTL
- El Larguero SER Podcast
- Nadie Sabe Nada SER Podcast
- SER Historia SER Podcast
- Todo Concostrina SER Podcast
- 安住紳一郎の日曜天国 TBS RADIO
- TED Talks Daily TED
- The Tucker Carlson Show Tucker Carlson Network
- 辛坊治郎 ズーム そこまで言うか! ニッポン放送
- 飯田浩司のOK! Cozy up! Podcast ニッポン放送
- 武田鉄矢・今朝の三枚おろし 文化放送PodcastQR