Plannable Approximations to MDP Homomorphisms: Equivariance under Actions

Conference Paper (2020)
Author(s)

Elise van der Pol (Universiteit van Amsterdam)

Thomas Kipf (Universiteit van Amsterdam)

Frans A. Oliehoek (TU Delft - Interactive Intelligence)

Max Welling (Universiteit van Amsterdam)

Research Group
Interactive Intelligence
Copyright
© 2020 Elise van der Pol, Thomas Kipf, F.A. Oliehoek, Max Welling
Publication Year
2020
Language
English
Pages (from-to)
1431–1439
ISBN (print)
9781450375184
ISBN (electronic)
9781450375184
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This work exploits action equivariance for representation learning in reinforcement learning. Equivariance under actions means that transitions in the input space are mirrored by equivalent transitions in latent space, so that mapping to the latent space and applying the latent transition commutes with transitioning in the input space and then mapping. We introduce a contrastive loss function that enforces action equivariance on the learned representations. We prove that when our loss is zero, we have a homomorphism of a deterministic Markov Decision Process (MDP). Learning equivariant maps leads to structured latent spaces, allowing us to build a model on which we plan through value iteration. We show experimentally that for deterministic MDPs, the optimal policy in the abstract MDP can be successfully lifted to the original MDP. Moreover, the approach easily adapts to changes in the goal states. Empirically, we show that in such MDPs, we obtain better representations in fewer epochs than reconstruction-based representation learning approaches, while generalizing better to new goals than model-free approaches.
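The two ingredients described in the abstract, an action-equivariant contrastive loss and value iteration on the learned abstract MDP, can be illustrated with a short sketch. The following PyTorch code is a minimal, hypothetical illustration: the MLP encoder, one-hot action encoding, hinge margin for negative samples, and nearest-neighbour discretisation of the latent space are assumptions for the sake of the example, not the authors' implementation.

```python
# Minimal sketch of an action-equivariant contrastive loss and planning in the
# learned abstract MDP. All architecture choices and hyperparameters here are
# illustrative assumptions, not the published implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentModel(nn.Module):
    """Encoder Z: s -> z and an action-conditioned latent transition."""
    def __init__(self, obs_dim, n_actions, latent_dim=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        # Predict a latent translation from (z, one-hot action); a simple MLP is assumed.
        self.transition = nn.Sequential(
            nn.Linear(latent_dim + n_actions, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.n_actions = n_actions

    def forward(self, obs, action):
        z = self.encoder(obs)
        a_onehot = F.one_hot(action, self.n_actions).float()
        z_next_pred = z + self.transition(torch.cat([z, a_onehot], dim=-1))
        return z, z_next_pred

def equivariance_loss(model, obs, action, next_obs, neg_obs, margin=1.0):
    """Contrastive loss: pull Z(s') toward Z(s) + T(Z(s), a), push negative samples away."""
    _, z_next_pred = model(obs, action)
    z_next = model.encoder(next_obs)
    z_neg = model.encoder(neg_obs)
    positive = (z_next_pred - z_next).pow(2).sum(-1)
    negative = F.relu(margin - (z_next_pred - z_neg).pow(2).sum(-1))
    return (positive + negative).mean()

def plan_in_latent_space(model, states, actions, goal_state, gamma=0.9, iters=50):
    """Build a discrete abstract MDP from embedded states and run value iteration."""
    with torch.no_grad():
        zs = model.encoder(states)                                 # abstract state set
        goal = model.encoder(goal_state.unsqueeze(0))
        reward = (torch.cdist(zs, goal).squeeze(-1) < 0.1).float() # goal indicator reward
        # Deterministic successor table: nearest embedded state to each predicted next latent.
        succ = torch.stack([
            torch.cdist(model(states, torch.full((len(states),), a, dtype=torch.long))[1],
                        zs).argmin(-1)
            for a in actions])                                      # [n_actions, n_states]
        values = torch.zeros(len(states))
        for _ in range(iters):
            values = reward + gamma * values[succ].max(0).values
        # Lifting: in the original MDP, act greedily with respect to these abstract values
        # by choosing the action whose successor has the highest value.
    return values
```

Changing the goal only changes the reward indicator in the abstract MDP, so replanning for a new goal amounts to rerunning value iteration on the fixed learned model, which is consistent with the adaptability to new goals claimed in the abstract.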

Files

VanDerPol20AAMAS.pdf
(PDF | 3.24 MB)
License info not available