On Numerical Stochastic Optimal Control via Bellman's Dynamic Programming Principle
Abstract
In this work, we present an application of stochastic control theory to Merton's portfolio optimization problem. The dynamic programming methodology reduces the problem to solving the well-known Hamilton-Jacobi-Bellman (HJB) equation that arises from Merton's portfolio optimization problem under a power utility function. Finally, a numerical method is proposed to solve the HJB equation and compute the optimal strategy. The numerical solutions are compared with the explicit closed-form solutions for the optimal consumption and investment control policies.
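For the investment policy, the abstract's comparison against explicit solutions refers to the classical closed-form Merton fraction under power (CRRA) utility: the optimal constant fraction of wealth held in the risky asset is pi* = (mu - r) / (gamma * sigma^2). A minimal sketch of that formula follows; the parameter names and numerical values (drift mu, risk-free rate r, volatility sigma, risk-aversion gamma) are illustrative assumptions, not figures from the thesis.

```python
def merton_fraction(mu: float, r: float, sigma: float, gamma: float) -> float:
    """Closed-form Merton fraction under power (CRRA) utility.

    Returns the optimal constant fraction of wealth invested in the
    risky asset: pi* = (mu - r) / (gamma * sigma**2).
    """
    return (mu - r) / (gamma * sigma ** 2)

# Illustrative parameters (hypothetical, not taken from the thesis):
# 8% drift, 2% risk-free rate, 20% volatility, risk aversion 2.
pi_star = merton_fraction(mu=0.08, r=0.02, sigma=0.20, gamma=2.0)
print(pi_star)
```

A numerical HJB solver for this problem would be validated by checking that its computed investment policy converges to this constant fraction.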
Subject Area
Mathematics
Recommended Citation
Aboagye, Prince Osei, "On Numerical Stochastic Optimal Control via Bellman's Dynamic Programming Principle" (2018). ETD Collection for University of Texas, El Paso. AAI10841432.
https://scholarworks.utep.edu/dissertations/AAI10841432