- Video lecture 1: Introduction and Semicontractive Examples. Set of Lecture Slides.
- Video lecture 2: Semicontractive Analysis for Stochastic Optimal Control. Set of Lecture Slides.
- Video lecture 3: Extensions to Abstract DP Models. Set of Lecture Slides.
- Video lecture 4: Applications to Stochastic Shortest Path and Other Problems. Set of Lecture Slides.
- Video lecture 5: Algorithms. Set of Lecture Slides.

Videos from a 5-lecture series on Semicontractive Dynamic Programming, a methodology introduced in the research monograph Abstract Dynamic Programming.

The monograph aims at a unified and economical development of the core theory and algorithms of total cost sequential decision problems. Semicontractive DP refers qualitatively to a collection of models in which some policies have a regularity/contraction-like property while others do not. Such models are exemplified by problems involving a termination state, such as shortest path problems, both deterministic and stochastic.
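As a minimal sketch of the semicontractive phenomenon (an illustrative toy example, not one taken from the lectures), consider a one-state shortest path problem: at state 1 one can either stop (pay cost 1 and move to the termination state, a "proper" policy) or loop (pay cost 0 and stay at state 1, an "improper" policy). Bellman's equation J = min(1, J) then has infinitely many solutions (any J ≤ 1), and value iteration converges to different fixed points depending on the starting guess:

```python
def bellman(J):
    # State 1 has two choices:
    #   "stop": cost 1, then move to the termination state (proper policy)
    #   "loop": cost 0, then stay at state 1 (improper policy)
    return min(1.0 + 0.0, 0.0 + J)

def value_iteration(J0, iters=50):
    # Repeatedly apply the Bellman operator starting from the guess J0.
    J = J0
    for _ in range(iters):
        J = bellman(J)
    return J

# Every J <= 1 solves J = min(1, J), so the limit depends on the start:
print(value_iteration(0.0))   # 0.0: total cost of looping forever (improper)
print(value_iteration(5.0))   # 1.0: total cost of the proper "stop" policy
```

The contraction-based theory, which would guarantee a unique solution of Bellman's equation and convergence of value iteration from any starting point, breaks down here because the improper policy is not contractive; this is the kind of behavior the semicontractive framework is designed to analyze.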

The lectures, listed above, focus on recent research, which is described in recent papers and in an online 2nd edition of the monograph.

Additional Overview Lectures:

Video from an Oct. 2017 lecture at UConn on optimal control, abstract, and semicontractive dynamic programming. Related paper and Set of Lecture Slides.

Video from a May 2017 lecture at MIT on the solutions of Bellman's equation, stable optimal control, and semicontractive dynamic programming. Related paper and Set of Lecture Slides.
