RSS 2019

OIL: Observational Imitation Learning

Guohao Li, Matthias Müller, Vincent Casser, Neil Smith, Dominik L. Michels, Bernard Ghanem

Abstract

Recent work has explored the problem of autonomous navigation by imitating a teacher and learning an end-to-end policy. However, these approaches tend to be sensitive to mistakes by the teacher and do not scale well to other environments or vehicles. To this end, we propose Observational Imitation Learning (OIL), a novel imitation learning variant that supports online training and automatic selection of optimal behavior by observing multiple imperfect teachers. We apply our proposed methodology to the challenging problems of autonomous driving and UAV racing. We train a perception network to predict waypoints from raw image data and use OIL to train another network to predict controls from these waypoints. Extensive experiments demonstrate that our trained network outperforms its teachers, conventional imitation learning and reinforcement learning baselines, and even humans in simulation.
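The key idea in the abstract, selecting the best behavior online from several imperfect teachers and imitating only that, can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the names `best_teacher_action` and `score_action`, the toy steering state, and the noisy teacher controllers are all invented here for clarity.

```python
# Hypothetical sketch of OIL-style teacher selection: several imperfect
# teachers each propose an action for the current state; a scoring
# function (e.g. predicted progress toward the next waypoint) ranks the
# proposals, and the learner imitates only the highest-scoring one.

def best_teacher_action(state, teachers, score_action):
    """Return the best-scoring action among all teacher proposals."""
    proposals = [teacher(state) for teacher in teachers]
    return max(proposals, key=lambda action: score_action(state, action))

# Toy example: the state is a target steering angle, the teachers are
# noisy controllers, and the score penalizes distance from the target.
target = 0.3
teachers = [lambda s: s + 0.2, lambda s: s - 0.05, lambda s: -s]
score = lambda s, a: -abs(a - target)

label = best_teacher_action(target, teachers, score)  # the imitation target
```

In an online training loop, `label` would serve as the regression target for the control network at that step, so mistakes by any single teacher are filtered out rather than imitated.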

Resources

arXiv: 1803.01129

Citation

@inproceedings{li2019oil,
  title     = {{OIL}: Observational Imitation Learning},
  author    = {Li, Guohao and M{\"{u}}ller, Matthias and Casser, Vincent and Smith, Neil and Michels, Dominik L. and Ghanem, Bernard},
  booktitle = {Robotics: Science and Systems (RSS)},
  year      = {2019}
}