## Abstract

The Transience objective requires that no state be visited infinitely often. While this cannot be satisfied in any finite Markov decision process (MDP), it can hold in countably infinite ones, e.g., if the transition graph is acyclic. We prove the following fundamental properties of Transience in countably infinite MDPs.
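As a hypothetical illustration (not taken from the paper) of transience in a countably infinite system, consider the classical biased random walk on the integers — a countably infinite Markov chain, i.e., an MDP with a single action per state. For bias p ≠ 1/2 the walk is transient: every state is visited only finitely often almost surely. The sketch below computes the classical return probability and the resulting expected number of visits to the start state.

```python
# Minimal sketch, assuming the standard ±1 random walk on the integers
# with P(step = +1) = p. This is illustrative only and does not appear
# in the paper.

def return_probability(p: float) -> float:
    """Probability that the walk ever returns to its start state.

    Classical result: r = 2 * min(p, 1 - p) = 1 - |2p - 1|.
    r < 1 iff p != 1/2, i.e., the walk is transient iff it is biased.
    """
    return 1.0 - abs(2.0 * p - 1.0)

def expected_visits_to_start(p: float) -> float:
    """Expected total number of visits to the start state (counting time 0).

    Each return happens independently with probability r, so the visit
    count is geometric with success probability 1 - r, giving mean
    1 / (1 - r). Only defined for p != 1/2 (the recurrent case r = 1
    has infinitely many expected visits).
    """
    r = return_probability(p)
    return 1.0 / (1.0 - r)
```

For p = 0.7 the return probability is 0.6, so the start state is visited 2.5 times in expectation; for p = 0.5 the return probability is 1 and the walk is recurrent, so Transience fails.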

1. There exist uniformly *ε*-optimal MD strategies (memoryless deterministic) for Transience, even in infinitely branching MDPs.

2. Optimal strategies for Transience need not exist, even if the MDP is finitely branching. However, if an optimal strategy exists then there is also an optimal MD strategy.

3. If an MDP is universally transient (i.e., almost surely transient under all strategies) then many other objectives have a lower strategy complexity than in general MDPs. E.g., *ε*-optimal strategies for Safety and co-Büchi and optimal strategies for {0,1,2}-Parity (where they exist) can be chosen MD, even if the MDP is infinitely branching.

Original language | English |
---|---|

Title of host publication | Proceedings of the 32nd International Conference on Concurrency Theory |

Number of pages | 34 |

Volume | 203 |

Publication status | Accepted/In press - 14 Jul 2021 |

Event | 32nd International Conference on Concurrency Theory - Online, Paris, France |

Duration | 23 Aug 2021 → 27 Aug 2021 |

Internet address | https://qonfest2021.lacl.fr/concur21.php |

### Publication series

Name | LIPIcs - Leibniz International Proceedings in Informatics |
---|---|

### Conference

Conference | 32nd International Conference on Concurrency Theory |
---|---|

Abbreviated title | CONCUR 2021 |

Country/Territory | France |

City | Paris |

Period | 23/08/21 → 27/08/21 |

Internet address | https://qonfest2021.lacl.fr/concur21.php |

## Keywords

- Markov decision processes
- Parity
- Transience