Stochastic Maximum Principle for Mean-Field Type Optimal Control Under Partial Information
- Source :
- IEEE Transactions on Automatic Control. 59:522-528
- Publication Year :
- 2014
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2014.
Abstract
- This technical note is concerned with a partially observed optimal control problem whose novel feature is that the cost functional is of mean-field type. Hence the problem is time inconsistent, in the sense that Bellman's dynamic programming principle does not hold. A maximum principle is established using Girsanov's theorem and a convex variation argument. Some nonlinear filtering results for backward stochastic differential equations (BSDEs) are developed by expressing the solutions of the BSDEs as Itô processes. An illustrative example demonstrates both the maximum principle and the filtering results.
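- For orientation, here is a minimal sketch of the problem class described in the abstract, assuming a standard partially observed mean-field formulation; the functions $b$, $\sigma$, $h$, $f$, $g$ and the sign conventions are generic placeholders and not necessarily those used in the paper:
\[
\begin{aligned}
dx_t &= b\bigl(t, x_t, \mathbb{E}[x_t], u_t\bigr)\,dt + \sigma\bigl(t, x_t, \mathbb{E}[x_t], u_t\bigr)\,dW_t, \\
dy_t &= h(t, x_t)\,dt + dB_t, \qquad y_0 = 0, \\
J(u) &= \mathbb{E}\!\left[\int_0^T f\bigl(t, x_t, \mathbb{E}[x_t], u_t\bigr)\,dt + g\bigl(x_T, \mathbb{E}[x_T]\bigr)\right],
\end{aligned}
\]
where admissible controls $u_t$ are adapted to the observation filtration $\mathcal{F}^y_t = \sigma\{y_s : s \le t\}$. Because $J$ depends on the law of $x_t$ through $\mathbb{E}[x_t]$, dynamic programming fails; in such formulations optimality is instead characterized via a Hamiltonian $H = f + p\,b + q\,\sigma$, an adjoint BSDE of mean-field type for the pair $(p_t, q_t)$, and a variational inequality on $\mathbb{E}[H_u \mid \mathcal{F}^y_t]$, obtained through Girsanov's theorem and convex variation as indicated in the abstract.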
- Subjects :
- Mathematical optimization
Girsanov theorem
Differential equation
Conditional probability distribution
Optimal control
Computer Science Applications
Dynamic programming
Stochastic differential equation
Maximum principle
Mathematics::Probability
Control and Systems Engineering
Convex optimization
Electrical and Electronic Engineering
Mathematics
Details
- ISSN :
- 1558-2523 and 0018-9286
- Volume :
- 59
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Automatic Control
- Accession number :
- edsair.doi...........9833770e596c39d8762f4312957d30c8
- Full Text :
- https://doi.org/10.1109/tac.2013.2273265