Active visual object exploration and recognition with an unmanned aerial vehicle

Uriel Martinez Hernandez, Victor Cedeno-Campos, Adrian Rubio-Solis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

In this paper, an active control method for visual object exploration and recognition with an unmanned aerial vehicle is presented. This work uses a convolutional neural network for visual object recognition, where input images of multiple objects are obtained with an unmanned aerial vehicle. The object recognition task is an iterative process actively controlled by a saliency map module, which extracts interesting object regions for exploration. The active control allows the unmanned aerial vehicle to autonomously explore more informative object regions to improve recognition accuracy. The iterative exploration task stops when the class probability from the convolutional neural network exceeds a decision threshold. The active control is validated with offline and real-time experiments for visual exploration and recognition of five objects. Passive exploration is also tested for performance comparison. Experiments show that the unmanned aerial vehicle is capable of autonomously exploring interesting object regions. Results also show an improvement in recognition accuracy from 88.14% (passive exploration) to 95.66% (active exploration). Overall, this work offers a framework that allows robots to autonomously decide where to move and look next, improving performance during visual object exploration and recognition tasks.
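
A rough Python sketch of the exploration loop described in the abstract is given below; it is not the authors' implementation. The callables capture_image, classify, salient_regions and move_to are hypothetical placeholders for the on-board camera, the convolutional neural network, the saliency map module and the flight controller, and the 0.90 decision threshold and 20-step limit are assumed values.

# Minimal sketch (all names are placeholders, not the authors' code) of a
# saliency-driven active exploration loop: move towards a salient object
# region, capture a new view, classify it with a CNN, and stop once the
# class probability exceeds a decision threshold.
def active_exploration(capture_image, classify, salient_regions, move_to,
                       threshold=0.90, max_steps=20):
    """Explore salient object regions until the classifier is confident.

    capture_image()        -> image from the on-board camera (placeholder)
    classify(image)        -> dict of class label -> probability (placeholder CNN)
    salient_regions(image) -> candidate regions, most salient first (placeholder saliency map)
    move_to(region)        -> commands the vehicle towards a region (placeholder controller)
    """
    image = capture_image()
    for step in range(max_steps):
        probs = classify(image)
        label, confidence = max(probs.items(), key=lambda kv: kv[1])
        if confidence >= threshold:      # decision threshold reached
            return label, confidence, step
        # Active control: head for the most salient region and look again.
        move_to(salient_regions(image)[0])
        image = capture_image()
    # Threshold never reached: return the best guess after max_steps views.
    probs = classify(image)
    label, confidence = max(probs.items(), key=lambda kv: kv[1])
    return label, confidence, max_steps

For the passive baseline reported in the abstract, move_to would instead follow a fixed, predefined sequence of viewpoints rather than the most salient region.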

Original language: English
Title of host publication: International Joint Conference on Neural Networks (IJCNN)
Place of Publication: United States
Publisher: IEEE
Number of pages: 7
ISBN (Electronic): 9781728119854
ISBN (Print): 9781728119854
DOIs: https://doi.org/10.1109/IJCNN.2019.8851738
Publication status: Published - 30 Sep 2019

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Publisher: IEEE
Volume: 2019-July
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Martinez Hernandez, U., Cedeno-Campos, V., & Rubio-Solis, A. (2019). Active visual object exploration and recognition with an unmanned aerial vehicle. In International Joint Conference on Neural Networks (IJCNN) [8851738] (Proceedings of the International Joint Conference on Neural Networks; Vol. 2019-July). United States: IEEE. https://doi.org/10.1109/IJCNN.2019.8851738
