On the 33rd episode we review Paul Werbos's "Applications of Advances in Nonlinear Sensitivity Analysis," which presents efficient methods for computing derivatives in nonlinear systems, drastically reducing computational costs for large-scale models.

Werbos, Paul J. "Applications of advances in nonlinear sensitivity analysis." System Modeling and Optimization: Proceedings of the 10th IFIP Conference, New York City, USA, August 31–September 4, 1981.

These methods, especially the backward differentiation technique, enable better sensitivity analysis, optimization, and stochastic modeling across economics, engineering, and artificial intelligence. The paper also introduces Generalized Dynamic Heuristic Programming (GDHP) for adaptive decision-making in uncertain environments.

Its importance to modern data science lies in laying the foundation for backpropagation, the core algorithm behind training neural networks. Werbos's work bridged traditional optimization and today's AI, influencing machine learning, reinforcement learning, and data-driven modeling.
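The key idea behind the backward differentiation technique is what we now call reverse-mode automatic differentiation: record how each intermediate value was computed, then sweep backward once from the output to obtain the derivative with respect to every input, instead of one forward pass per input. A minimal sketch of that idea (a hypothetical illustration for this episode, not Werbos's original formulation or notation):

```python
# Minimal reverse-mode autodiff sketch: each Var remembers its parents and
# the local derivative of its value w.r.t. each parent, so a single
# backward sweep accumulates all partial derivatives of the output.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent_var, local_derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(output):
    """Propagate d(output)/d(node) from the output back to every input."""
    output.grad = 1.0
    order, seen = [], set()

    def topo(v):                      # topological order of the graph
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                topo(parent)
            order.append(v)

    topo(output)
    for v in reversed(order):         # chain rule, one backward sweep
        for parent, local in v.parents:
            parent.grad += v.grad * local

# f(x, y) = x*y + x  ->  df/dx = y + 1, df/dy = x
x, y = Var(3.0), Var(4.0)
f = x * y + x
backward(f)
print(x.grad, y.grad)  # prints: 5.0 3.0
```

Both gradients come out of one backward pass over the computation graph; for a scalar output with many inputs, this is the cost advantage the paper exploits and that backpropagation inherits.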
Data Science #34 - The deep learning original paper review, Hinton, Rumelhart & Williams (1985)
Data Science #32 - A Markovian Decision Process, Richard Bellman (1957)
Data Science #31 - Correlation and causation (1921), Wright Sewall
Data Science #30 - The Bootstrap Method (1977)