Random projection-based partial feature extraction for robust face recognition

Chunfei Ma, June Young Jung, Seung Wook Kim, Sung Jea Ko

Research output: Contribution to journal › Article › peer-review

16 Citations (Scopus)


In this paper, a novel feature extraction method for robust face recognition (FR) is proposed. The proposed method combines a simple yet effective dimensionality increasing (DI) method with an information-preserving dimensionality reduction (DR) method. For the proposed DI method, we employ rectangle filters, which sum the pixel values within a randomized rectangle window on the face image to extract a feature. By convolving the face image with all possible rectangle filters of various locations and scales, the face image in the image space is projected to a very high-dimensional feature space where more discriminative information can be incorporated. In order to significantly reduce the computational complexity while preserving the most informative features, we adopt a random projection method based on compressed sensing theory for DR. Unlike traditional holistic feature extraction methods, which require a time-consuming, data-dependent training procedure, the proposed method is part-based and data-independent. Extensive experimental results on representative FR databases show that, compared with conventional feature extraction methods, our proposed method not only achieves higher recognition accuracy but also shows better robustness to corruption, occlusion, and disguise.
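The two stages described above can be sketched in code. This is not the authors' implementation; it is a minimal illustration assuming single-rectangle (Haar-like) sum filters evaluated via an integral image for the DI stage, and a data-independent Gaussian random matrix (a standard choice in compressed sensing) for the DR stage. The function names, filter count, and output dimension are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rectangle_features(img, n_filters=5000, rng=rng):
    """DI stage: sum pixel values inside randomly placed and scaled
    rectangles. Each sum costs O(1) via the integral image trick."""
    h, w = img.shape
    # Integral image with a zero border: any rectangle sum is 4 lookups.
    ii = np.zeros((h + 1, w + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    # Random top-left and bottom-right corners (vectorized over filters).
    y0 = rng.integers(0, h, n_filters)
    x0 = rng.integers(0, w, n_filters)
    y1 = rng.integers(y0 + 1, h + 1)
    x1 = rng.integers(x0 + 1, w + 1)
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

def random_projection(features, out_dim=300, rng=rng):
    """DR stage: data-independent Gaussian random projection, which
    preserves pairwise distances with high probability (Johnson-
    Lindenstrauss / restricted isometry arguments from CS theory)."""
    d = features.shape[-1]
    R = rng.normal(0.0, 1.0 / np.sqrt(out_dim), size=(d, out_dim))
    return features @ R
```

Because both the rectangle placements and the projection matrix are drawn once at random and never trained, the whole pipeline is data-independent: no gallery images are needed before feature extraction, unlike holistic methods such as PCA or LDA.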

Original language: English
Pages (from-to): 1232-1244
Number of pages: 13
Issue number: PC
Publication status: Published - 2015 Feb 3


Keywords

  • Compressed sensing
  • Face recognition
  • Feature extraction
  • Random projection
  • Robustness

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence


