SIAM Journal on Control and Optimization, Vol. 51, No. 4, pp. 3235-3257, 2013
STOCHASTIC MINIMUM PRINCIPLE FOR PARTIALLY OBSERVED SYSTEMS SUBJECT TO CONTINUOUS AND JUMP DIFFUSION PROCESSES AND DRIVEN BY RELAXED CONTROLS
In this paper, we consider nonconvex control problems for stochastic differential equations driven by relaxed controls adapted, in the weak-star sense, to a current of sigma-algebras generated by the observable processes. We cover in a unified way both continuous diffusion and jump processes. We first establish the existence of optimal controls and then derive the necessary conditions of optimality (unlike some papers in this area) using only functional analysis. We develop a stochastic Hamiltonian system of equations on a rigorous basis using semimartingale representation theory and the Riesz representation theorem, which leads naturally to the existence of the adjoint process satisfying a backward stochastic differential equation. In other words, our approach yields the existence of the adjoint process as a natural consequence of the Riesz representation theorem, while ensuring at the same time its (weak-star) measurability. This is unlike other papers, where the adjoint process is introduced before its existence is proved. We consider this one of the major contributions of this paper. We also discuss the realizability of relaxed controls by regular controls using the Krein-Milman theorem, which we consider another major contribution. Finally, we believe that our approach is direct and easy to follow, resting on the precise logic of functional analysis.
Keywords: stochastic differential equations; continuous diffusion; jump processes; relaxed controls; existence of optimal controls; necessary conditions of optimality
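For orientation only, the following is a minimal sketch of the type of controlled jump-diffusion dynamics, Hamiltonian, and adjoint backward equation referred to in the abstract. The notation (coefficients b, sigma, g, running cost \ell, terminal cost \Phi, Levy measure \nu, and the adjoint triple (\psi, Q, R)) is generic and assumed for illustration; it is not taken from the body of the paper, where the dynamics are driven by relaxed controls and adapted to the observable filtration.

% Illustrative sketch (assumed generic notation, not the paper's own):
\begin{align*}
  % Controlled state driven by a Brownian motion W and a compensated
  % Poisson random measure \tilde N, with control u_t:
  dx_t &= b(t,x_t,u_t)\,dt + \sigma(t,x_t,u_t)\,dW_t
          + \int_{Z} g(t,x_{t-},u_t,z)\,\tilde N(dt,dz), \\
  % Hamiltonian pairing state, control, and adjoint variables (\psi,Q,R):
  H(t,x,u,\psi,Q,R) &= \langle b(t,x,u),\psi\rangle
          + \operatorname{tr}\!\big(\sigma^{\top}(t,x,u)\,Q\big)
          + \int_{Z} \langle g(t,x,u,z),R(z)\rangle\,\nu(dz) + \ell(t,x,u), \\
  % Adjoint process as a backward SDE with terminal condition from \Phi:
  -d\psi_t &= H_x(t,x_t,u_t,\psi_t,Q_t,R_t)\,dt - Q_t\,dW_t
          - \int_{Z} R_t(z)\,\tilde N(dt,dz), \qquad \psi_T = \Phi_x(x_T).
\end{align*}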