Automatic physiognomic analysis by classifying facial component features

Hee Deok Yang, Seong Whan Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physical personality of the face can be modeled as a combination of facial feature components. The facial region is first detected in an input image so that the various facial feature components can be analyzed. The gender of the subject is then classified, and the facial components are extracted. The Active Appearance Model (AAM) is used to extract facial feature points, from which 16 measures are computed to assign each facial component to a defined class, such as large eye or small mouth. After the facial components are classified according to each classification criterion and the gender of the subject, physiognomic information is generated by combining the classified results across all criteria. The proposed method was tested on samples from 200 people and achieved a classification rate of 85.5% across all facial component features.
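
The abstract only outlines the pipeline (face detection, gender classification, AAM landmark extraction, 16 geometric measures, per-component classification), so the sketch below is a minimal illustration of the measure-and-threshold idea rather than the authors' actual method. The 68-point landmark layout, the LANDMARKS and THRESHOLDS tables, and the two example measures are assumptions for illustration only; the paper's 16 measures and its gender-specific classification criteria are not specified in this record.

import numpy as np

# Hypothetical landmark groups; the paper's actual AAM point layout is not
# given in the abstract, so these slice indices are illustrative assumptions.
LANDMARKS = {
    "left_eye": slice(36, 42),
    "right_eye": slice(42, 48),
    "mouth": slice(48, 60),
    "face": slice(0, 17),  # jaw outline, used only to normalize by face width
}

def width(points: np.ndarray) -> float:
    """Horizontal extent of a group of 2-D landmark points."""
    return float(points[:, 0].max() - points[:, 0].min())

def component_measures(landmarks: np.ndarray) -> dict:
    """Compute size ratios normalized by face width (two of many possible measures)."""
    face_width = width(landmarks[LANDMARKS["face"]])
    return {
        "eye_size": (width(landmarks[LANDMARKS["left_eye"]])
                     + width(landmarks[LANDMARKS["right_eye"]])) / (2 * face_width),
        "mouth_size": width(landmarks[LANDMARKS["mouth"]]) / face_width,
    }

# Assumed per-gender cut-offs; in the paper the classification criteria are
# derived from data, not hard-coded like this.
THRESHOLDS = {
    "male": {"eye_size": 0.22, "mouth_size": 0.45},
    "female": {"eye_size": 0.24, "mouth_size": 0.43},
}

def classify_components(landmarks: np.ndarray, gender: str) -> dict:
    """Label each measured component as 'large' or 'small' for the given gender."""
    cuts = THRESHOLDS[gender]
    return {name: ("large" if value >= cuts[name] else "small")
            for name, value in component_measures(landmarks).items()}

if __name__ == "__main__":
    # Random points stand in for AAM output (68 landmarks, one (x, y) pair each).
    rng = np.random.default_rng(0)
    fake_landmarks = rng.random((68, 2)) * 200.0
    print(classify_components(fake_landmarks, "male"))

Running the snippet prints a dictionary mapping each measured component to "large" or "small"; in the paper these per-component labels, together with the classified gender, are what get combined into the final physiognomic description.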

Original language: English
Title of host publication: Proceedings - International Conference on Pattern Recognition
Pages: 1212-1215
Number of pages: 4
Volume: 2
DOIs: 10.1109/ICPR.2006.1196
Publication status: Published - 2006 Dec 1
Event: 18th International Conference on Pattern Recognition, ICPR 2006 - Hong Kong, China
Duration: 2006 Aug 20 - 2006 Aug 24

Other

Other: 18th International Conference on Pattern Recognition, ICPR 2006
Country: China
City: Hong Kong
Period: 06/8/20 - 06/8/24

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Electrical and Electronic Engineering

Cite this

Yang, H. D., & Lee, S. W. (2006). Automatic physiognomic analysis by classifying facial component features. In Proceedings - International Conference on Pattern Recognition (Vol. 2, pp. 1212-1215). [1699427] https://doi.org/10.1109/ICPR.2006.1196

Automatic physiognomic analysis by classifying facial component features. / Yang, Hee Deok; Lee, Seong Whan.

Proceedings - International Conference on Pattern Recognition. Vol. 2 2006. p. 1212-1215 1699427.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Yang, HD & Lee, SW 2006, Automatic physiognomic analysis by classifying facial component features. in Proceedings - International Conference on Pattern Recognition. vol. 2, 1699427, pp. 1212-1215, 18th International Conference on Pattern Recognition, ICPR 2006, Hong Kong, China, 06/8/20. https://doi.org/10.1109/ICPR.2006.1196
Yang HD, Lee SW. Automatic physiognomic analysis by classifying facial component features. In Proceedings - International Conference on Pattern Recognition. Vol. 2. 2006. p. 1212-1215. 1699427 https://doi.org/10.1109/ICPR.2006.1196
Yang, Hee Deok ; Lee, Seong Whan. / Automatic physiognomic analysis by classifying facial component features. Proceedings - International Conference on Pattern Recognition. Vol. 2 2006. pp. 1212-1215
@inproceedings{eff9a47908444261b8ab24fa926fa5fe,
title = "Automatic physiognomic analysis by classifying facial component features",
abstract = "This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physical personality of the face can be modeled as a combination of facial feature components. The facial region is first detected in an input image so that the various facial feature components can be analyzed. The gender of the subject is then classified, and the facial components are extracted. The Active Appearance Model (AAM) is used to extract facial feature points, from which 16 measures are computed to assign each facial component to a defined class, such as large eye or small mouth. After the facial components are classified according to each classification criterion and the gender of the subject, physiognomic information is generated by combining the classified results across all criteria. The proposed method was tested on samples from 200 people and achieved a classification rate of 85.5{\%} across all facial component features.",
author = "Yang, {Hee Deok} and Lee, {Seong Whan}",
year = "2006",
month = "12",
day = "1",
doi = "10.1109/ICPR.2006.1196",
language = "English",
isbn = "0769525210",
volume = "2",
pages = "1212--1215",
booktitle = "Proceedings - International Conference on Pattern Recognition",

}

TY - GEN

T1 - Automatic physiognomic analysis by classifying facial component features

AU - Yang, Hee Deok

AU - Lee, Seong Whan

PY - 2006/12/1

Y1 - 2006/12/1

N2 - This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physical personality of the face can be modeled as a combination of facial feature components. The facial region is first detected in an input image so that the various facial feature components can be analyzed. The gender of the subject is then classified, and the facial components are extracted. The Active Appearance Model (AAM) is used to extract facial feature points, from which 16 measures are computed to assign each facial component to a defined class, such as large eye or small mouth. After the facial components are classified according to each classification criterion and the gender of the subject, physiognomic information is generated by combining the classified results across all criteria. The proposed method was tested on samples from 200 people and achieved a classification rate of 85.5% across all facial component features.

AB - This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physical personality of the face can be modeled as a combination of facial feature components. The facial region is first detected in an input image so that the various facial feature components can be analyzed. The gender of the subject is then classified, and the facial components are extracted. The Active Appearance Model (AAM) is used to extract facial feature points, from which 16 measures are computed to assign each facial component to a defined class, such as large eye or small mouth. After the facial components are classified according to each classification criterion and the gender of the subject, physiognomic information is generated by combining the classified results across all criteria. The proposed method was tested on samples from 200 people and achieved a classification rate of 85.5% across all facial component features.

UR - http://www.scopus.com/inward/record.url?scp=34047223598&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=34047223598&partnerID=8YFLogxK

U2 - 10.1109/ICPR.2006.1196

DO - 10.1109/ICPR.2006.1196

M3 - Conference contribution

AN - SCOPUS:34047223598

SN - 0769525210

SN - 9780769525211

VL - 2

SP - 1212

EP - 1215

BT - Proceedings - International Conference on Pattern Recognition

ER -