Dynamic Programming and Hamilton–Jacobi–Bellman Equations on Time Scales

Complexity 2020:1-11 (2020)

Abstract

The Bellman optimality principle for stochastic dynamic systems on time scales is derived; it includes the continuous-time and discrete-time settings as special cases. The Hamilton–Jacobi–Bellman equation on time scales is then obtained. Finally, an example is employed to illustrate the main results.
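Since the discrete time scale T = Z is one of the special cases the abstract mentions, the flavor of the Bellman optimality principle can be illustrated with an ordinary backward recursion for a finite-horizon problem. The dynamics, cost, and grid below are hypothetical choices for illustration only, not the paper's stochastic time-scale formulation:

```python
# Minimal discrete-time Bellman backward recursion (time scale T = Z).
# Hypothetical example: dynamics x_{k+1} = x_k + u_k, stage cost x^2 + u^2,
# finite horizon N, with states and controls restricted to small integer grids.

states = range(-3, 4)     # admissible states
controls = range(-2, 3)   # admissible controls
N = 5                     # horizon length

V = {x: x * x for x in states}  # terminal cost V_N(x) = x^2
for k in range(N - 1, -1, -1):
    # Bellman recursion: V_k(x) = min_u [ x^2 + u^2 + V_{k+1}(x + u) ]
    V = {
        x: min(
            x * x + u * u + V[x + u]
            for u in controls
            if x + u in V       # keep the next state on the grid
        )
        for x in states
    }

print(V)  # value function V_0 on the state grid
```

By symmetry of the dynamics and cost, the computed value function satisfies V(x) = V(-x), and V(0) = 0 since staying at the origin with zero control incurs no cost.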


Analytics

Added to PP
2020-12-22

