Planning in Markov Stochastic Task Domains
| Content Provider | Open Access Library (OALib) |
|---|---|
| Author | Yong Lin; Fillia Makedon (United States of America) |
| Abstract | In decision-theoretic planning, a challenge for Markov decision processes (MDPs) and partially observable Markov decision processes (POMDPs) is that many problem domains contain large state spaces and complex tasks, which results in poor solution performance. We develop a task analysis and modeling (TAM) approach, in which the (PO)MDP model is separated into a task view and an action view. In the task view, TAM models the problem domain using a task equivalence model, with task-dependent abstract states and observations. We provide a learning algorithm to obtain the parameter values of task equivalence models. We present three typical examples to illustrate the TAM approach. Experimental results indicate that our approach can greatly improve the computational capacity of task planning in Markov stochastic domains. |
| ISSN | 2180-124X |
| Journal | International Journal of Artificial Intelligence and Expert Systems |
| Publisher | Computer Science Journals |
| Publisher Date | 2010-01-01 |
| Access Restriction | Open |
| Subject Keyword | Decision-making; Task planning; Markov decision processes; Uncertainty; POMDP |
| Content Type | Text |
| Resource Type | Article |
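The abstract describes collapsing a large ground-level (PO)MDP into task-dependent abstract states. As a rough intuition for that idea (a generic state-abstraction sketch, not the authors' TAM algorithm; all names and the tiny example MDP below are hypothetical), one can map ground states to abstract states and aggregate the ground transition probabilities within each abstract state:

```python
# Illustrative sketch of MDP state abstraction (hypothetical example,
# not the TAM method from the article).
from collections import defaultdict

# Ground MDP: transition probabilities P[state][action][next_state]
# for a tiny 4-state chain domain.
P = {
    "s0": {"go": {"s1": 0.8, "s0": 0.2}},
    "s1": {"go": {"s2": 0.7, "s1": 0.3}},
    "s2": {"go": {"s3": 0.9, "s2": 0.1}},
    "s3": {"go": {"s3": 1.0}},
}

# Task-dependent abstraction: ground states that are equivalent for the
# task map to the same abstract state.
phi = {"s0": "far", "s1": "far", "s2": "near", "s3": "goal"}

def abstract_transitions(P, phi):
    """Build abstract transition probabilities by averaging ground
    transitions uniformly over the members of each abstract state."""
    groups = defaultdict(list)
    for s in P:
        groups[phi[s]].append(s)
    P_abs = {}
    for z, members in groups.items():
        P_abs[z] = {}
        actions = {a for s in members for a in P[s]}
        for a in actions:
            acc = defaultdict(float)
            for s in members:
                for s2, p in P[s].get(a, {}).items():
                    acc[phi[s2]] += p / len(members)
            P_abs[z][a] = dict(acc)
    return P_abs

P_abs = abstract_transitions(P, phi)
print(P_abs["far"]["go"])   # abstract dynamics from the "far" region
print(P_abs["near"]["go"])  # abstract dynamics from the "near" region
```

Planning then proceeds over the (much smaller) abstract state space, which is the kind of computational saving the abstract refers to; how TAM actually constructs its task equivalence model is detailed in the article itself.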