
Necessary and Sufficient Conditions for Optimality

in Singular Control of Stochastic Differential Equations

Brahim Mezerdi1

1King Fahd University of Petroleum and Minerals,

Department of Mathematics and Statistics, Dhahran, Saudi Arabia

E-mail: brahim.mezerdi@kfupm.edu.sa

In this talk, we present some of the main aspects of singular control of systems driven by stochastic differential equations. In the first part, we recall the two major approaches used in stochastic control theory, namely the dynamic programming principle, which leads to the so-called Hamilton-Jacobi-Bellman equation, and the Pontryagin maximum principle. The second part is devoted to recent results on the stochastic maximum principle for singular control problems. We investigate the relationship between the dynamic programming principle and the Pontryagin maximum principle. In particular, we show that the adjoint process coincides with the derivative of the value function. Worked examples will be given.
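To fix ideas, a singular control problem of the type discussed above can be sketched as follows. This is a schematic formulation only: the coefficients b, sigma, G and the cost data f, k, g are generic placeholders, not the specific model of the talk.

```latex
% Schematic singular control problem (generic coefficients; illustrative only).
% The state x is driven by an absolutely continuous control u and a
% singular control \xi of bounded variation:
\begin{align*}
  dx(t) &= b\bigl(t, x(t), u(t)\bigr)\,dt
          + \sigma\bigl(t, x(t), u(t)\bigr)\,dW(t)
          + G(t)\,d\xi(t), \\
  J(u,\xi) &= \mathbb{E}\Bigl[\int_0^T f\bigl(t, x(t), u(t)\bigr)\,dt
          + \int_0^T k(t)\,d\xi(t) + g\bigl(x(T)\bigr)\Bigr], \\
  V(t,x) &= \inf_{(u,\xi)} J(u,\xi).
\end{align*}
% The relationship stated in the abstract: when the value function V is
% smooth enough, the adjoint process p of the maximum principle satisfies
\[
  p(t) = \partial_x V\bigl(t, x(t)\bigr)
  \qquad \text{along an optimal trajectory.}
\]
```

Here the dynamic programming approach characterizes V through a Hamilton-Jacobi-Bellman variational inequality, while the maximum principle characterizes optimal controls through the adjoint process p; the displayed identity is the link between the two approaches.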

Keyword(s): stochastic differential equation - stochastic control - dynamic programming - Pontryagin maximum principle