Learning from sensory predictions for autonomous and adaptive exploration of object shape with a tactile robot

Uriel Martinez Hernandez, Adrian Rubio-Solis, Tony J. Prescott

Research output: Contribution to journal › Article › peer-review

9 Citations (SciVal)
31 Downloads (Pure)

Abstract

Humans use information from sensory predictions, together with
current observations, for the optimal exploration and recognition of
their surrounding environment. In this work, two novel adaptive
perception strategies are proposed for accurate and fast exploration of
object shape with a robotic tactile sensor. These strategies, called (1)
adaptive weighted prior and (2) adaptive weighted posterior, combine
tactile sensory predictions and current sensor observations to
autonomously adapt the accuracy and speed of active Bayesian perception
in object exploration tasks. Sensory predictions, obtained from a forward
model, use a novel Predicted Information Gain method. These predictions
are used by the tactile robot to analyse 'what would have happened' if
certain decisions 'had been made' at previous decision times. The
accuracy of predictions is evaluated and controlled by a confidence
parameter, to ensure that the adaptive perception strategies rely more on
predictions when they are accurate, and more on current sensory
observations otherwise. This work is systematically validated with the
recognition of angle and position data extracted from the exploration of
object shape, using a biomimetic tactile sensor and a robotic platform.
The exploration task implements the contour following procedure used by
humans to extract object shape with the sense of touch. The validation
process compares the adaptive weighted strategies against active
perception alone. The adaptive approach achieved higher angle accuracy
(2.8 deg) over active perception (5 deg). The position accuracy was
similar for all perception methods (0.18 mm). The reaction time or number
of tactile contacts, needed by the tactile robot to make a decision, was
improved by the adaptive perception (1 tap) over active perception (5
taps). The results show that the adaptive perception strategies can
enable future robots to adapt their performance, while improving the
trade-off between accuracy and reaction time, for tactile exploration,
interaction and recognition tasks.
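The interplay described above — a recursive Bayesian update over tactile contacts, with a confidence parameter weighting a forward-model prediction against current observations — can be sketched in a small toy example. Everything below (the Gaussian observation model, the class count, the confidence value, all function names) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 72           # e.g. discrete angle classes (illustrative assumption)
BELIEF_THRESHOLD = 0.9   # posterior belief required before making a decision

def likelihoods(z):
    # Toy Gaussian observation model over the discrete classes;
    # a stand-in for a measurement model learned from tactile data.
    c = np.arange(N_CLASSES)
    return np.exp(-0.5 * (z - c) ** 2) + 1e-12

def recursive_bayes(prior, observations):
    """Active Bayesian perception loop: refine the posterior with each
    tactile contact (tap) until the belief threshold is reached."""
    posterior = prior.copy()
    taps = 0
    for z in observations:
        taps += 1
        posterior = posterior * likelihoods(z)
        posterior /= posterior.sum()
        if posterior.max() > BELIEF_THRESHOLD:
            break
    return posterior, taps

def adaptive_weighted_prior(prediction, confidence):
    """Blend a predicted posterior (e.g. from a forward model) with a
    uniform prior; confidence in [0, 1] sets how much the robot trusts
    the prediction versus current observations alone."""
    uniform = np.full(N_CLASSES, 1.0 / N_CLASSES)
    return confidence * prediction + (1.0 - confidence) * uniform

# Simulated taps around true class 30 (synthetic data, for illustration only)
true_cls = 30
obs = true_cls + rng.normal(0.0, 0.3, size=20)

flat_prior = np.full(N_CLASSES, 1.0 / N_CLASSES)
flat_post, flat_taps = recursive_bayes(flat_prior, obs)

# A (hypothetical) accurate prediction concentrated near the true class:
# with high confidence it biases the prior toward that class.
prediction = np.exp(-0.5 * (np.arange(N_CLASSES) - true_cls) ** 2)
prediction /= prediction.sum()
adapt_post, adapt_taps = recursive_bayes(
    adaptive_weighted_prior(prediction, confidence=0.9), obs)

print(f"uniform prior: {flat_taps} taps, adaptive prior: {adapt_taps} taps")
```

With these toy numbers an accurate, high-confidence prediction lets the posterior reach the decision threshold in no more taps than the uniform prior does, mirroring the reaction-time improvement the abstract reports; an inaccurate prediction would simply be down-weighted via the confidence parameter.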
Original language: English
Pages (from-to): 127-139
Number of pages: 13
Journal: Neurocomputing
Volume: 382
Early online date: 5 Dec 2019
Publication status: Published - 21 Mar 2020

Bibliographical note

Funding Information:
The authors would like to thank the Sheffield Robotics Lab at the University of Sheffield, and the Autonomous System Lab at the University of Bath, for the robotic facilities and the technical support provided for this research work.

Publisher Copyright:
© 2019 Elsevier B.V.

Keywords

  • Active and adaptive perception
  • Autonomous tactile exploration
  • Bayesian inference
  • Sensorimotor control

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
