Expression Robust 3D Face Landmarking Using Thresholded Surface Normals

Jiangning Gao, Adrian Evans

Research output: Contribution to journal › Article

10 Citations (Scopus)
48 Downloads (Pure)

Abstract

3D face recognition is an increasingly popular modality for biometric authentication, for example in the iPhone X. Landmarking plays a significant role in region-based face recognition algorithms: the accuracy and consistency of the landmarking directly determine the effectiveness of feature extraction and hence the overall recognition performance. While surface normals have been shown to provide high-performing features for face recognition, their use in landmarking has not been widely explored. To this end, a new 3D facial landmarking algorithm based on thresholded surface normal maps is proposed, which is applicable to widely used 3D face databases. The benefits of employing surface normals are demonstrated for both facial roll and yaw rotation calibration and nasal landmark localization. Results on the Bosphorus, FRGC and BU-3DFE databases show that the detected landmarks possess high within-class consistency and accuracy under different expressions. For several key landmarks the performance achieved surpasses that of state-of-the-art techniques, while the algorithm is also training-free and computationally efficient. The use of surface normals therefore provides a useful representation of the 3D surface, and the proposed landmarking algorithm provides an effective approach to localising the key nasal landmarks.
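The abstract's core idea can be illustrated with a minimal sketch. The following is not the authors' implementation; it assumes a depth (range) image as input, estimates per-pixel surface normals from depth gradients, and thresholds the normal's z-component to keep near-frontal surface points — the apex of a nose-like bump, where the surface faces the camera head-on, survives the threshold:

```python
import numpy as np

def surface_normals(depth):
    """Estimate per-pixel unit surface normals of z = f(x, y)
    from a depth image via finite-difference gradients."""
    dz_dy, dz_dx = np.gradient(depth.astype(float))
    # The (unnormalised) normal of z = f(x, y) is (-dz/dx, -dz/dy, 1).
    n = np.dstack((-dz_dx, -dz_dy, np.ones_like(depth, dtype=float)))
    return n / np.linalg.norm(n, axis=2, keepdims=True)

def threshold_normal_map(normals, min_nz=0.95):
    """Binary map keeping pixels whose normal points nearly at the
    camera (z-component above min_nz). min_nz is an illustrative
    threshold, not a value taken from the paper."""
    return normals[..., 2] >= min_nz

# Synthetic "nose": a Gaussian bump on a flat plane.
y, x = np.mgrid[-30:31, -30:31]
depth = 10.0 * np.exp(-(x**2 + y**2) / 100.0)

normals = surface_normals(depth)
mask = threshold_normal_map(normals, min_nz=0.999)
```

At the bump's apex the depth gradient vanishes, so its normal is (0, 0, 1) and the pixel passes the threshold; on the bump's flanks the normal tilts away and the pixels are rejected. A real pipeline would combine such thresholded maps with pose calibration before localizing the nasal landmarks.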

Original language: English
Pages (from-to): 120-132
Number of pages: 13
Journal: Pattern Recognition
Volume: 78
Early online date: 17 Jan 2018
DOIs
Publication status: Published - 1 Jun 2018

Keywords

  • 3D face landmarking
  • Surface normals

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

