Monitoring plan execution in partially observable stochastic worlds

Wang, Minlue (2014). Monitoring plan execution in partially observable stochastic worlds. Ph.D. thesis, University of Birmingham.


This thesis presents two novel algorithms for monitoring plan execution in stochastic, partially observable environments. Such problems can be formulated as partially observable Markov decision processes (POMDPs). Exact solutions to POMDP problems are difficult to find due to their computational complexity, so many approximate solutions have been proposed instead. These POMDP solvers tend to generate an approximate policy at planning time and then execute it unchanged at run time. Our approaches instead monitor the execution of the initial approximate policy and perform a plan-modification procedure to improve the policy's quality at run time. This thesis considers two approximate POMDP solvers. One is a translation-based solver that converts a subclass of POMDPs, called quasi-deterministic POMDPs (QDET-POMDPs), into classical planning problems or Markov decision processes (MDPs); the resulting approximate solution is either a contingency plan or an MDP policy that requires full observability of the world at run time. The other is a point-based POMDP solver that generates an approximate policy using sampling techniques. A simulation study of the algorithms has shown that our execution-monitoring approaches can improve the approximate POMDP solvers' overall performance in terms of plan quality, plan generation time and plan execution time.
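Execution monitoring in a POMDP rests on tracking a belief state: after each action and observation, the agent's probability distribution over hidden states is revised by Bayes' rule. The following is a minimal sketch of that standard update, not code from the thesis; the tiny two-state model and the names `T` and `O` are illustrative assumptions.

```python
# Standard POMDP belief update: after taking action a and receiving
# observation o, the posterior belief is
#   b'(s') ∝ O(s', a, o) * sum_s T(s, a, s') * b(s).
# The model below is a made-up two-state example, not from the thesis.

def belief_update(belief, T, O, action, obs):
    """Return the normalized posterior belief over states."""
    n = len(belief)
    new_belief = []
    for s2 in range(n):
        # Prediction step: probability of landing in s2 under the action.
        pred = sum(T[s][action][s2] * belief[s] for s in range(n))
        # Correction step: weight by the likelihood of the observation.
        new_belief.append(O[s2][action][obs] * pred)
    z = sum(new_belief)
    return [p / z for p in new_belief]

# Hypothetical model: 2 states, 1 action, 2 observations.
T = [[[0.9, 0.1]], [[0.2, 0.8]]]   # T[s][a][s']
O = [[[0.8, 0.2]], [[0.3, 0.7]]]   # O[s'][a][o]
b = belief_update([0.5, 0.5], T, O, action=0, obs=0)
```

A run-time monitor can compare this tracked belief against the assumptions baked into the approximate policy (e.g. the full-observability assumption of an MDP policy) and trigger plan modification when they diverge.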

Type of Work: Thesis (Doctorates > Ph.D.)
Award Type: Doctorates > Ph.D.
College/Faculty: Colleges (2008 onwards) > College of Engineering & Physical Sciences
School or Department: School of Computer Science
Funders: None/not applicable
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science

