Data Science Decoded

Data Science #33 - The Backpropagation method, Paul Werbos (1980)

November 3, 2025·57 min
Episode Description from the Publisher

On the 33rd episode we review Paul Werbos's "Applications of Advances in Nonlinear Sensitivity Analysis," which presents efficient methods for computing derivatives in nonlinear systems, drastically reducing computational costs for large-scale models.

Werbos, Paul J. "Applications of advances in nonlinear sensitivity analysis." System Modeling and Optimization: Proceedings of the 10th IFIP Conference, New York City, USA, August 31–September 4, 1981.

These methods, especially the backward differentiation technique, enable better sensitivity analysis, optimization, and stochastic modeling across economics, engineering, and artificial intelligence. The paper also introduces Generalized Dynamic Heuristic Programming (GDHP) for adaptive decision-making in uncertain environments.

Its importance to modern data science lies in laying the foundation for backpropagation, the core algorithm behind training neural networks. Werbos's work bridged traditional optimization and today's AI, influencing machine learning, reinforcement learning, and data-driven modeling.
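The key idea behind Werbos's backward differentiation is that one backward sweep through a computation yields the derivative of the output with respect to every parameter at once, instead of one forward pass per parameter. A minimal sketch of that idea on a toy one-hidden-unit network (illustrative only; the network, variable names, and loss are our own, not Werbos's notation):

```python
import math

# Toy model: y_hat = w2 * tanh(w1 * x), squared-error loss.
# Forward-mode differentiation would need a separate pass per
# parameter; the backward sweep below recovers both partial
# derivatives in a single pass by applying the chain rule from
# the output back toward the inputs.

def forward(x, w1, w2):
    h = math.tanh(w1 * x)   # hidden activation
    y_hat = w2 * h          # network output
    return h, y_hat

def backward(x, y, w1, w2):
    h, y_hat = forward(x, w1, w2)
    loss = (y_hat - y) ** 2
    # Backward sweep: propagate dL/d(node) from output to parameters.
    d_yhat = 2.0 * (y_hat - y)        # dL/dy_hat
    d_w2 = d_yhat * h                 # dL/dw2
    d_h = d_yhat * w2                 # dL/dh
    d_w1 = d_h * (1.0 - h ** 2) * x   # dL/dw1, using tanh' = 1 - tanh^2
    return loss, d_w1, d_w2

# Sanity check against a central finite difference for dL/dw1.
x, y, w1, w2 = 0.5, 1.0, 0.3, 0.7
loss, d_w1, d_w2 = backward(x, y, w1, w2)
eps = 1e-6
lp = (forward(x, w1 + eps, w2)[1] - y) ** 2
lm = (forward(x, w1 - eps, w2)[1] - y) ** 2
numeric = (lp - lm) / (2 * eps)
print(abs(d_w1 - numeric) < 1e-6)  # analytic and numeric gradients agree
```

The same backward sweep scales to networks with millions of parameters, which is why this result underpins modern neural-network training.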
