Learning motion primitives of object manipulation using Mimesis Model

Bidan Huang, Joanna J. Bryson, Tetsunari Inamura

Research output: Chapter in a published conference proceeding


Abstract

In this paper, we present a system for learning manipulation motion primitives from human demonstration. The system, based on the statistical "Mimesis Model", provides an easy-to-use human interface for learning manipulation motion primitives, as well as a natural language interface that allows humans to modify and instruct robot motions. The human-demonstrated manipulation motion primitives are initially encoded by Hidden Markov Models (HMMs). The models are then projected into a topological space, where they are labeled and their similarities are represented as distances. We then explore the unknown regions of this space by interpolating between known models. New motion primitives are thus generated from the unknown regions to meet new manipulation scenarios. We demonstrate the system by learning bimanual grasping strategies. The implemented system successfully reproduces and generalizes the motion primitives in different grasping scenarios.
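The generation step described above, producing a new primitive by interpolating between known models in the projected space, can be sketched roughly as follows. This is an illustrative toy example, not the paper's implementation: the function name `interpolate_hmms`, the two-state HMM structure, and the "narrow"/"wide" grasp primitives are all hypothetical, and a real Mimesis-Model system would interpolate full HMM parameter sets learned from demonstration data.

```python
# Hypothetical sketch: generating a new motion primitive by blending two
# known HMMs, in the spirit of interpolation in a proto-symbol space.
# All names and parameter values are illustrative assumptions.

def interpolate_hmms(hmm_a, hmm_b, alpha):
    """Blend two HMMs with weight alpha in [0, 1].

    Each HMM is a dict with:
      'trans': row-stochastic transition matrix (list of lists)
      'means': per-state emission means (list of lists)
    """
    def lerp(u, v):
        return [(1 - alpha) * x + alpha * y for x, y in zip(u, v)]

    # Interpolate transition rows, then renormalize each row to sum to 1,
    # so the blended matrix remains a valid stochastic matrix.
    trans = []
    for row_a, row_b in zip(hmm_a['trans'], hmm_b['trans']):
        row = lerp(row_a, row_b)
        total = sum(row)
        trans.append([p / total for p in row])

    # Emission means can be interpolated directly.
    means = [lerp(m_a, m_b) for m_a, m_b in zip(hmm_a['means'], hmm_b['means'])]
    return {'trans': trans, 'means': means}


# Two toy 2-state primitives, e.g. a "narrow" and a "wide" bimanual grasp.
grasp_narrow = {'trans': [[0.9, 0.1], [0.0, 1.0]],
                'means': [[0.1, 0.1], [0.2, 0.1]]}
grasp_wide   = {'trans': [[0.7, 0.3], [0.0, 1.0]],
                'means': [[0.5, 0.5], [0.8, 0.6]]}

# A new primitive halfway between the two demonstrated ones.
grasp_mid = interpolate_hmms(grasp_narrow, grasp_wide, 0.5)
```

In practice the interpolation weights would be chosen by locating the target scenario in the topological space relative to the labeled models, and the blended HMM would then be decoded into a motion trajectory.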
Original language: English
Title of host publication: 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO)
Publisher: IEEE
Pages: 1144-1150
Number of pages: 7
DOIs
Publication status: Published - 2013
Event: 2013 IEEE International Conference on Robotics and Biomimetics, ROBIO 2013 - Shenzhen, China
Duration: 12 Dec 2013 - 14 Dec 2013

Conference

Conference: 2013 IEEE International Conference on Robotics and Biomimetics, ROBIO 2013
Country/Territory: China
City: Shenzhen
Period: 12/12/13 - 14/12/13

