Real-world problems involve considerable uncertainty, especially in applications of Markov processes. Our interest lies in reducing the state uncertainty inherent in a general partially observable Markov decision process (POMDP). To reduce this uncertainty, it is indispensable to obtain additional information about the state of the Markov process. Among the various possible structures for such additional information, this study focuses on the case in which an uncertain, delayed observation of the state is obtained after one transition.
In other words, this thesis addresses a Markov decision process with both lagged and current partial observations. First, this study develops a basic information structure that adds lagged observations to a general POMDP. Second, we study a rule for deriving the state vector based on this information structure. The resulting special POMDP model is solved with a modified one-pass algorithm.
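To make the information structure concrete, the following is a minimal sketch of a belief-state update that combines a lagged observation of the previous state with a current partial observation of the new state. All state, transition, and observation numbers here are hypothetical two-state illustrations, not taken from the thesis, and the update rule shown is the standard Bayesian filtering form under this assumed structure rather than the thesis's exact derivation.

```python
# Hypothetical two-state example (all probabilities are illustrative).
P = [[0.9, 0.1], [0.2, 0.8]]      # P[s][s']   : transition probabilities
O_lag = [[0.8, 0.2], [0.3, 0.7]]  # O_lag[s][z]: P(lagged obs z | previous state s)
O_cur = [[0.7, 0.3], [0.4, 0.6]]  # O_cur[s][o]: P(current obs o | new state s)

def belief_update(b, z, o):
    """One-step Bayes update using both the lagged observation z of the
    previous state and the current partial observation o of the new state:
        b'(s') ~ O_cur[s'][o] * sum_s O_lag[s][z] * P[s][s'] * b[s]
    """
    n = len(b)
    b_new = [O_cur[sp][o] * sum(O_lag[s][z] * P[s][sp] * b[s] for s in range(n))
             for sp in range(n)]
    total = sum(b_new)                 # normalizing constant
    return [x / total for x in b_new]

b = belief_update([0.5, 0.5], z=0, o=1)  # uniform prior, then one update
print(b)
```

The lagged observation sharpens the belief about the state occupied before the transition, and that sharpened belief is then propagated through the transition matrix and conditioned on the current observation.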
These results are illustrated by a decision-making problem in a trading company with two alternatives and two sources of information.