Learning from sensory predictions for autonomous and adaptive exploration of object shape with a tactile robot

Uriel Martinez Hernandez, Adrian Rubio-Solis, Tony J. Prescott

Research output: Contribution to journal › Article

Abstract

Humans use information from sensory predictions, together with
current observations, for the optimal exploration and recognition of
their surrounding environment. In this work, two novel adaptive
perception strategies are proposed for accurate and fast exploration of
object shape with a robotic tactile sensor. These strategies, called 1)
adaptive weighted prior and 2) adaptive weighted posterior, combine
tactile sensory predictions and current sensor observations to
autonomously adapt the accuracy and speed of active Bayesian perception
in object exploration tasks. Sensory predictions, obtained from a forward
model, use a novel Predicted Information Gain method. These predictions
allow the tactile robot to analyse 'what would have happened' if
certain decisions 'had been made' at previous decision times. The
accuracy of the predictions is evaluated and controlled by a confidence
parameter, ensuring that the adaptive perception strategies rely more on
predictions when they are accurate, and more on current sensory
observations otherwise. This work is systematically validated with the
recognition of angle and position data extracted from the exploration of
object shape, using a biomimetic tactile sensor and a robotic platform.
The exploration task implements the contour-following procedure used by
humans to extract object shape with the sense of touch. The validation
process is performed with the adaptive weighted strategies and with active
perception alone. The adaptive approach achieved higher angle accuracy
(2.8 deg) than active perception (5 deg). Position accuracy was
similar for all perception methods (0.18 mm). The reaction time, or number
of tactile contacts needed by the tactile robot to make a decision, was
improved by adaptive perception (1 tap) over active perception (5
taps). The results show that the adaptive perception strategies can
enable future robots to adapt their performance, improving the
trade-off between accuracy and reaction time in tactile exploration,
interaction and recognition tasks.
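The abstract describes the two strategies only at a high level. The sketch below illustrates one plausible reading in Python, where a confidence parameter interpolates between a forward-model prediction and the current Bayesian belief; the convex weighting, the uniform prior, the toy likelihoods and all function names are illustrative assumptions, not the implementation reported in the paper.

```python
"""Minimal sketch of confidence-weighted Bayesian perception, assuming a
convex blend between a forward-model prediction and the current belief."""

import numpy as np


def bayes_update(prior, likelihood):
    """Standard Bayesian update over discrete perceptual classes."""
    posterior = prior * likelihood
    return posterior / posterior.sum()


def adaptive_weighted_prior(prev_posterior, predicted_belief, likelihood, confidence):
    """Blend the forward-model prediction into the prior before the update.

    confidence in [0, 1]: high values trust the prediction, low values fall
    back on the belief carried over from the previous tactile observation.
    """
    prior = confidence * predicted_belief + (1.0 - confidence) * prev_posterior
    return bayes_update(prior, likelihood)


def adaptive_weighted_posterior(prev_posterior, predicted_belief, likelihood, confidence):
    """Blend the forward-model prediction into the posterior after the update."""
    posterior = bayes_update(prev_posterior, likelihood)
    mixed = confidence * predicted_belief + (1.0 - confidence) * posterior
    return mixed / mixed.sum()


# Toy usage: 5 candidate angle classes, one tactile contact (tap).
prior = np.full(5, 0.2)                           # uniform initial belief
predicted = np.array([0.1, 0.6, 0.1, 0.1, 0.1])   # hypothetical forward-model prediction
likelihood = np.array([0.2, 0.5, 0.1, 0.1, 0.1])  # hypothetical P(tap | angle class)
belief = adaptive_weighted_prior(prior, predicted, likelihood, confidence=0.8)
print("belief after one tap:", np.round(belief, 3))
# A decision threshold on the belief (e.g. 0.9) would determine whether the
# robot decides now or taps again, trading accuracy against reaction time.
```

Under this reading, a high confidence value lets an accurate prediction concentrate the belief after fewer taps, which is consistent with the reported reduction in reaction time from 5 taps to 1 tap.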
Original language: English
Journal: Neurocomputing
Publication status: Accepted/In press - 9 Oct 2019

Cite this

Learning from sensory predictions for autonomous and adaptive exploration of object shape with a tactile robot. / Martinez Hernandez, Uriel; Rubio-Solis, Adrian; Prescott, Tony J.

In: Neurocomputing, 09.10.2019.


@article{f83e4d72ae7847ad80fcb7106b1bc4e0,
title = "Learning from sensory predictions for autonomous and adaptive exploration of object shape with a tactile robot",
author = "{Martinez Hernandez}, Uriel and Rubio-Solis, Adrian and Prescott, {Tony J.}",
year = "2019",
month = oct,
day = "9",
language = "English",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier",

}
