Learning to Control a Low-Cost Manipulator using Data-Efficient Reinforcement Learning

File: rss2011_revision.pdf (Accepted version, 2.73 MB, Adobe PDF)
Title: Learning to Control a Low-Cost Manipulator using Data-Efficient Reinforcement Learning
Author(s): Deisenroth, MP
Rasmussen, CE
Fox, D
Item Type: Conference Paper
Abstract: In recent years, there has been substantial progress in robust manipulation in unstructured environments. The long-term goal of our work is to move away from precise but very expensive robotic systems and to develop affordable, potentially imprecise, self-adaptive manipulator systems that can interactively perform tasks such as playing with children. In this paper, we demonstrate how a low-cost, off-the-shelf robotic system can learn closed-loop policies for a stacking task in only a handful of trials, from scratch. Our manipulator is inaccurate and provides no pose feedback. To learn a controller in the work space of a Kinect-style depth camera, we use a model-based reinforcement learning technique. Our learning method is data efficient, reduces model bias, and deals with several noise sources in a principled way during long-term planning. We present a way of incorporating state-space constraints into the learning process and analyze the learning gain obtained by exploiting the sequential structure of the stacking task.
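The model-based approach described in the abstract learns a probabilistic dynamics model from a few trials and then plans through that model rather than the real robot, which is what makes it data efficient. A minimal illustrative sketch of that loop, using a hypothetical one-dimensional task and a least-squares linear model as a stand-in for the paper's probabilistic dynamics model (all names and dynamics here are invented for illustration, not the authors' implementation):

```python
import random

# Hypothetical task: drive state x to 0. True dynamics x' = 0.8*x + 0.5*u,
# unknown to the learner (stand-in for the real robot).
def true_step(x, u):
    return 0.8 * x + 0.5 * u

# 1. Collect a handful of random interactions (data efficiency: few samples).
random.seed(0)
data = [(x, u, true_step(x, u))
        for x, u in ((random.uniform(-1, 1), random.uniform(-1, 1))
                     for _ in range(20))]

# 2. Fit a dynamics model x' ~ a*x + b*u by least squares
#    (the paper uses a probabilistic model; this linear fit is a toy stand-in).
sxx = sum(x * x for x, u, y in data)
suu = sum(u * u for x, u, y in data)
sxu = sum(x * u for x, u, y in data)
sxy = sum(x * y for x, u, y in data)
suy = sum(u * y for x, u, y in data)
det = sxx * suu - sxu * sxu
a = (suu * sxy - sxu * suy) / det
b = (sxx * suy - sxu * sxy) / det

# 3. Long-term planning: simulate the learned model over a horizon and pick
#    the linear policy u = k*x minimizing predicted squared distance to 0.
def predicted_cost(k, x0=1.0, horizon=10):
    x, cost = x0, 0.0
    for _ in range(horizon):
        x = a * x + b * (k * x)   # roll out the *learned* model, not the robot
        cost += x * x
    return cost

best_k = min((k / 100 for k in range(-300, 301)), key=predicted_cost)
```

Because all planning rollouts happen inside the learned model, the real system is only queried for the 20 initial samples, mirroring the data-efficiency argument (the paper additionally propagates model uncertainty through the rollout, which this deterministic sketch omits).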
Publication Date: 30-Jun-2011
URI: http://hdl.handle.net/10044/1/11578
ISBN: 0262517795 / 9780262519687
Publisher: MIT Press
Journal / Book Title: Proceedings of the International Conference on Robotics: Science and Systems (RSS 2011)
Copyright Statement: © 2011 MIT Press
Conference Name: 2011 Robotics: Science and Systems Conference
Conference Location: Los Angeles, California
Publisher URL: http://www.roboticsproceedings.org/ ; http://mitpress.mit.edu/books/robotics-12
Start Date: 2011-06-27
Finish Date: 2011-07-01
Appears in Collections:Computing

Items in Spiral are protected by copyright, with all rights reserved, unless otherwise indicated.