A system for quantifying facial symmetry from 3D contour maps based on transfer learning and fast R-CNN

Hsiu Hsia Lin, Tianyi Zhang, Yu Chieh Wang, Chao Tung Yang*, Lun Jou Lo*, Chun Hao Liao, Shih Ku Kuang

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

Abstract

Physicians spend considerable time observing the facial symmetry of patients and collecting various data to arrive at an accurate clinical judgment. This study presents a transfer learning method for evaluating the degree of facial symmetry. The contour map of a face is used as training data, and the training module then classifies and scores the degree of facial symmetry. Our method enables rapid and accurate clinical assessments. In the experiments, we took 195 contour maps of patients’ faces provided by physicians and classified them into four fractional levels based on the average facial symmetry scores given by doctors. Subsequently, the facial data were trimmed, flipped, and superimposed. After this processing, the extent of the contour overlap was used as the basis for learning. We used data augmentation to increase the amount of data. Finally, we applied fine-tuning and transfer learning to obtain prediction models, which showed excellent performance.
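The abstract describes two core steps: mirroring a facial contour map and superimposing it on the original so that the extent of overlap reflects symmetry, and fine-tuning a pretrained network via transfer learning to classify the result into four symmetry levels. Below is a minimal sketch of those two ideas, not the authors' implementation; the ResNet-18 backbone, the averaging-based superimposition, and the file name are assumptions for illustration only.

```python
# Hedged sketch of the pipeline described in the abstract:
# (1) flip the contour map left-right and superimpose it on the original,
# (2) fine-tune a pretrained CNN (transfer learning) as a 4-class symmetry classifier.
# Backbone choice, superimposition rule, and file names are assumptions.

import numpy as np
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image


def overlap_map(contour: np.ndarray) -> np.ndarray:
    """Flip the contour map horizontally and superimpose it on the original.

    A symmetric face yields strong agreement between the map and its mirror;
    the paper only states that the extent of contour overlap is used, so the
    simple averaging below is an assumption.
    """
    mirrored = np.fliplr(contour)
    return (contour.astype(np.float32) + mirrored.astype(np.float32)) / 2.0


def build_model(num_classes: int = 4) -> nn.Module:
    """Transfer learning: reuse ImageNet weights, retrain only the classifier head."""
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for param in model.parameters():
        param.requires_grad = False                      # freeze pretrained backbone
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # new 4-class head
    return model


if __name__ == "__main__":
    # Hypothetical single-image example; 'contour_map.png' is a placeholder path.
    img = np.array(Image.open("contour_map.png").convert("L"))
    fused = overlap_map(img)

    preprocess = transforms.Compose([
        transforms.ToPILImage(),
        transforms.Grayscale(num_output_channels=3),     # backbone expects 3 channels
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    x = preprocess(fused.astype(np.uint8)).unsqueeze(0)

    model = build_model(num_classes=4).eval()
    with torch.no_grad():
        level = model(x).argmax(dim=1).item()
    print(f"Predicted symmetry level: {level}")
```

In practice the frozen backbone would first be fine-tuned on the labeled overlap maps (with augmentation) before being used for prediction as shown here.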

Original language: English
Pages (from-to): 15953-15973
Number of pages: 21
Journal: Journal of Supercomputing
Volume: 78
Issue number: 14
DOIs
State: Published - 09 2022

Bibliographical note

Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Keywords

  • Data augmentation
  • Deep learning
  • Facial symmetry
  • Fast R-CNN
  • Transfer learning
