Over the past few decades, the design of robots has gradually improved, allowing them to perform complex tasks in interaction with the world. To behave appropriately, robots must make perceptual decisions about their environment using their various sensory modalities. Even though robots are equipped with progressively more accurate and advanced sensors, dealing with uncertainty arising from the world and from their own sensory processes remains unavoidable in autonomous robotics. The challenge is to develop robust methods that allow robots to perceive their environment while managing uncertainty and optimizing their decision making. Such methods can be inspired by the way humans and animals actively direct their senses towards locations that reduce perceptual uncertainty. For instance, humans not only use their hands and fingers for exploration and feature extraction, but also guide their movements according to what is being perceived. This behaviour is also present in the animal kingdom; for example, rats actively explore their environment by appropriately moving their whiskers.