SIAM Journal on Control and Optimization, Vol. 40, No. 3, pp. 824-852, 2001
Stationary Hamilton-Jacobi equations in Hilbert spaces and applications to a stochastic optimal control problem
We study an infinite horizon stochastic control problem associated with a class of stochastic reaction-diffusion systems with coefficients having polynomial growth. The Hamiltonian is assumed to be only locally Lipschitz continuous, so that the quadratic case is covered. We prove that the value function V corresponding to the control problem is given by the solution of the stationary Hamilton-Jacobi equation associated with the state system. To this end we write the Hamilton-Jacobi equation in integral form and, by using the smoothing properties of the transition semigroup relative to the state system together with the theory of m-dissipative operators, we show that it admits a unique solution. Moreover, the value function V is obtained as the limit of the minima of approximating control problems, which admit unique optimal controls and states.
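For orientation, a generic formulation of this type of problem reads as follows; the symbols used here (the discount factor $\lambda$, running cost $g$, Hamiltonian $F$, semigroup $P_t$) are illustrative placeholders and not necessarily the paper's notation or exact setting. The value function of a discounted infinite horizon problem for a controlled state $X^{u}(\cdot;x)$ in a Hilbert space $H$ is
\[
V(x) \;=\; \inf_{u(\cdot)} \,\mathbb{E}\int_0^{\infty} e^{-\lambda t}\,\Big[g\big(X^{u}(t;x)\big) + h\big(u(t)\big)\Big]\,dt, \qquad x \in H,
\]
and the associated stationary Hamilton-Jacobi equation has the generic form
\[
\lambda\,\varphi(x) \;-\; \mathcal{L}\varphi(x) \;+\; F\big(x, D\varphi(x)\big) \;=\; g(x), \qquad x \in H,
\]
where $\mathcal{L}$ denotes the generator of the transition semigroup $(P_t)_{t\ge 0}$ of the uncontrolled state equation and $F$ is the (locally Lipschitz) Hamiltonian. Written in the integral (mild) form used to exploit the smoothing of $P_t$, such an equation becomes
\[
\varphi(x) \;=\; \int_0^{\infty} e^{-\lambda t}\, P_t\Big[\,g - F\big(\cdot, D\varphi\big)\Big](x)\,dt .
\]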
Keywords: stochastic reaction-diffusion systems; stationary Hamilton-Jacobi-Bellman equations in infinite dimension; infinite horizon stochastic control problems